Oct 07 13:07:07 crc systemd[1]: Starting Kubernetes Kubelet... Oct 07 13:07:07 crc restorecon[4655]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963 Oct 07 13:07:07 
crc restorecon[4655]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726 Oct 07 13:07:07 crc restorecon[4655]: 
/var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c574,c582 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Oct 07 
13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c440,c975 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 07 13:07:07 crc restorecon[4655]: 
/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c4,c22 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 07 13:07:07 crc 
restorecon[4655]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Oct 07 13:07:07 crc restorecon[4655]: 
/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 
Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c968,c969 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Oct 07 13:07:07 crc restorecon[4655]: 
/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 
07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 07 13:07:07 
crc restorecon[4655]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 07 13:07:07 crc restorecon[4655]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 07 13:07:07 crc restorecon[4655]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 07 13:07:07 
crc restorecon[4655]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 07 13:07:07 crc restorecon[4655]: 
/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 07 13:07:07 crc restorecon[4655]: 
/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 07 13:07:07 crc restorecon[4655]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 07 13:07:07 crc restorecon[4655]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 07 13:07:07 crc restorecon[4655]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c97,c980 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c377,c642 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 07 13:07:07 crc restorecon[4655]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 07 13:07:07 crc restorecon[4655]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c0,c25 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 07 13:07:07 crc restorecon[4655]: 
/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 07 13:07:07 crc restorecon[4655]: 
/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 13:07:07 crc restorecon[4655]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 13:07:07 crc restorecon[4655]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 07 13:07:07 crc restorecon[4655]: 
/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:07 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 
13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:08 crc restorecon[4655]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:08 crc restorecon[4655]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:07:08 crc 
restorecon[4655]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 07 13:07:08 crc restorecon[4655]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Oct 07 13:07:08 crc restorecon[4655]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 07 13:07:08 crc restorecon[4655]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c37,c572 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 07 13:07:08 crc restorecon[4655]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 
13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 
13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc 
restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c133,c223 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 07 13:07:08 crc restorecon[4655]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c682,c947 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 07 13:07:08 crc restorecon[4655]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 07 13:07:08 crc restorecon[4655]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Oct 07 13:07:09 crc kubenswrapper[4677]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Oct 07 13:07:09 crc kubenswrapper[4677]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Oct 07 13:07:09 crc kubenswrapper[4677]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Oct 07 13:07:09 crc kubenswrapper[4677]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
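[Editor's note] The long run of restorecon entries above ends with the relabel of /var/usrlocal/bin/kubenswrapper; everything else was left alone because the current container_file_t labels are treated by restorecon as admin customizations ("not reset as customized by admin"). To make an excerpt like this digestible, a small stdlib-only Python sketch such as the one below can tally the skipped paths per target SELinux label. It assumes the journal excerpt has been saved, one entry per line as journalctl emits it, to a hypothetical file named journal-excerpt.log; nothing here is part of the node itself.

import re
import sys
from collections import Counter

# Matches restorecon entries of the form:
#   restorecon[4655]: /var/lib/kubelet/... not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
ENTRY = re.compile(
    r"restorecon\[\d+\]:\s+(?P<path>\S+)\s+not reset as customized by admin to\s+(?P<label>\S+)"
)

def summarize(text: str) -> Counter:
    """Count skipped paths per target SELinux label."""
    return Counter(m.group("label") for m in ENTRY.finditer(text))

if __name__ == "__main__":
    # Hypothetical usage: python3 summarize_restorecon.py journal-excerpt.log
    text = open(sys.argv[1], encoding="utf-8").read()
    for label, count in summarize(text).most_common():
        print(f"{count:6d}  {label}")

For the excerpt above, the counts would be dominated by container_file_t:s0:c7,c13 (the two catalog pods), with a handful of other per-pod MCS pairs and the unleveled :s0 entries under /var/lib/kubelet/plugins.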
Oct 07 13:07:09 crc kubenswrapper[4677]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Oct 07 13:07:09 crc kubenswrapper[4677]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.040590 4677 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.048023 4677 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.048088 4677 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.048103 4677 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.048113 4677 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.048122 4677 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.048131 4677 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.048141 4677 feature_gate.go:330] unrecognized feature gate: NewOLM Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.048151 4677 feature_gate.go:330] unrecognized feature gate: InsightsConfig Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.048176 4677 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.048198 4677 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.048210 4677 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.048221 4677 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.048230 4677 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.048237 4677 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.048247 4677 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.048255 4677 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.048263 4677 feature_gate.go:330] unrecognized feature gate: Example Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.048271 4677 feature_gate.go:330] unrecognized feature gate: GatewayAPI Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.048281 4677 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
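[Editor's note] The kubenswrapper warnings above deprecate several command-line flags (--container-runtime-endpoint, --minimum-container-ttl-duration, --volume-plugin-dir, --register-with-taints, --system-reserved) in favor of the config file passed via --config, and report feature-gate names that the kubelet's own gate registry does not recognize. As a rough sketch of where those settings live, the snippet below prints the corresponding KubeletConfiguration keys as JSON (the kubelet accepts a JSON or YAML config file, per the linked documentation). The field names follow the upstream kubelet.config.k8s.io/v1beta1 schema as I understand it; every value is a placeholder, not this node's actual configuration.

import json

# Placeholder values only; field names follow the upstream
# kubelet.config.k8s.io/v1beta1 KubeletConfiguration schema.
kubelet_config = {
    "apiVersion": "kubelet.config.k8s.io/v1beta1",
    "kind": "KubeletConfiguration",
    # replaces --container-runtime-endpoint
    "containerRuntimeEndpoint": "unix:///var/run/crio/crio.sock",
    # replaces --volume-plugin-dir
    "volumePluginDir": "/etc/kubernetes/kubelet-plugins/volume/exec",
    # replaces --register-with-taints
    "registerWithTaints": [
        {"key": "node-role.kubernetes.io/master", "effect": "NoSchedule"}
    ],
    # replaces --system-reserved
    "systemReserved": {"cpu": "500m", "memory": "1Gi"},
    # --minimum-container-ttl-duration is superseded by eviction settings
    "evictionHard": {"memory.available": "100Mi"},
    # featureGates maps gate names to booleans; the "unrecognized feature
    # gate" warnings above are the kubelet reporting names it does not know
    "featureGates": {"ExampleFeatureGate": True},
}

print(json.dumps(kubelet_config, indent=2))

On an OpenShift-style node such as this one, the kubelet config file is typically rendered by the machine-config operator, so changes would normally go through a KubeletConfig custom resource rather than hand edits to the file.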
Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.048291 4677 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.048298 4677 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.048308 4677 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.048315 4677 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.048323 4677 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.048331 4677 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.048338 4677 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.048346 4677 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.048353 4677 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.048361 4677 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.048368 4677 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.048386 4677 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.048396 4677 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.048404 4677 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.048412 4677 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.048420 4677 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.048461 4677 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.048470 4677 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.048478 4677 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.048492 4677 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.048502 4677 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.048511 4677 feature_gate.go:330] unrecognized feature gate: OVNObservability Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.048520 4677 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.048530 4677 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.048538 4677 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.048546 4677 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.048554 4677 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.048565 4677 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.048575 4677 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.048585 4677 feature_gate.go:330] unrecognized feature gate: PinnedImages Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.048596 4677 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.048605 4677 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.048613 4677 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.048621 4677 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.048629 4677 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.048638 4677 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.048646 4677 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.048655 4677 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.048663 4677 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.048671 4677 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.048679 4677 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.048686 4677 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.048694 4677 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.048701 4677 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.048709 4677 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 
13:07:09.048717 4677 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.048724 4677 feature_gate.go:330] unrecognized feature gate: PlatformOperators Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.048731 4677 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.048739 4677 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.048747 4677 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.048754 4677 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.048762 4677 feature_gate.go:330] unrecognized feature gate: SignatureStores Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.048961 4677 flags.go:64] FLAG: --address="0.0.0.0" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.048992 4677 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.049012 4677 flags.go:64] FLAG: --anonymous-auth="true" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.049027 4677 flags.go:64] FLAG: --application-metrics-count-limit="100" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.049040 4677 flags.go:64] FLAG: --authentication-token-webhook="false" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.049049 4677 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.049061 4677 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.049073 4677 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.049083 4677 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.049092 4677 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.049101 4677 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.049114 4677 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.049124 4677 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.049134 4677 flags.go:64] FLAG: --cgroup-root="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.049144 4677 flags.go:64] FLAG: --cgroups-per-qos="true" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.049161 4677 flags.go:64] FLAG: --client-ca-file="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.049182 4677 flags.go:64] FLAG: --cloud-config="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.049194 4677 flags.go:64] FLAG: --cloud-provider="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.049206 4677 flags.go:64] FLAG: --cluster-dns="[]" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.049224 4677 flags.go:64] FLAG: --cluster-domain="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.049232 4677 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.049242 4677 flags.go:64] FLAG: --config-dir="" Oct 07 13:07:09 crc 
kubenswrapper[4677]: I1007 13:07:09.049251 4677 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.049261 4677 flags.go:64] FLAG: --container-log-max-files="5" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.049282 4677 flags.go:64] FLAG: --container-log-max-size="10Mi" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.049293 4677 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.049304 4677 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.049316 4677 flags.go:64] FLAG: --containerd-namespace="k8s.io" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.049328 4677 flags.go:64] FLAG: --contention-profiling="false" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.049339 4677 flags.go:64] FLAG: --cpu-cfs-quota="true" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.049349 4677 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.049360 4677 flags.go:64] FLAG: --cpu-manager-policy="none" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.049371 4677 flags.go:64] FLAG: --cpu-manager-policy-options="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.049385 4677 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.049396 4677 flags.go:64] FLAG: --enable-controller-attach-detach="true" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.049407 4677 flags.go:64] FLAG: --enable-debugging-handlers="true" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.049418 4677 flags.go:64] FLAG: --enable-load-reader="false" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.049464 4677 flags.go:64] FLAG: --enable-server="true" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.049476 4677 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.049491 4677 flags.go:64] FLAG: --event-burst="100" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.049503 4677 flags.go:64] FLAG: --event-qps="50" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.049514 4677 flags.go:64] FLAG: --event-storage-age-limit="default=0" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.049523 4677 flags.go:64] FLAG: --event-storage-event-limit="default=0" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.049532 4677 flags.go:64] FLAG: --eviction-hard="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.049544 4677 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.049553 4677 flags.go:64] FLAG: --eviction-minimum-reclaim="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.049562 4677 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.049574 4677 flags.go:64] FLAG: --eviction-soft="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.049583 4677 flags.go:64] FLAG: --eviction-soft-grace-period="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.049592 4677 flags.go:64] FLAG: --exit-on-lock-contention="false" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.049601 4677 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.049610 4677 
flags.go:64] FLAG: --experimental-mounter-path="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.049618 4677 flags.go:64] FLAG: --fail-cgroupv1="false" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.049627 4677 flags.go:64] FLAG: --fail-swap-on="true" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.049636 4677 flags.go:64] FLAG: --feature-gates="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.049647 4677 flags.go:64] FLAG: --file-check-frequency="20s" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.049656 4677 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.049666 4677 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.049675 4677 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.049684 4677 flags.go:64] FLAG: --healthz-port="10248" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.049694 4677 flags.go:64] FLAG: --help="false" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.049702 4677 flags.go:64] FLAG: --hostname-override="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.049711 4677 flags.go:64] FLAG: --housekeeping-interval="10s" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.049720 4677 flags.go:64] FLAG: --http-check-frequency="20s" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.049730 4677 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.049740 4677 flags.go:64] FLAG: --image-credential-provider-config="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.049748 4677 flags.go:64] FLAG: --image-gc-high-threshold="85" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.049757 4677 flags.go:64] FLAG: --image-gc-low-threshold="80" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.049766 4677 flags.go:64] FLAG: --image-service-endpoint="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.049775 4677 flags.go:64] FLAG: --kernel-memcg-notification="false" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.049784 4677 flags.go:64] FLAG: --kube-api-burst="100" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.049793 4677 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.049804 4677 flags.go:64] FLAG: --kube-api-qps="50" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.049812 4677 flags.go:64] FLAG: --kube-reserved="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.049821 4677 flags.go:64] FLAG: --kube-reserved-cgroup="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.049830 4677 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.049839 4677 flags.go:64] FLAG: --kubelet-cgroups="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.049848 4677 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.049857 4677 flags.go:64] FLAG: --lock-file="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.049865 4677 flags.go:64] FLAG: --log-cadvisor-usage="false" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.049874 4677 flags.go:64] FLAG: --log-flush-frequency="5s" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.049883 4677 flags.go:64] FLAG: --log-json-info-buffer-size="0" Oct 07 
13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.049898 4677 flags.go:64] FLAG: --log-json-split-stream="false" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.049908 4677 flags.go:64] FLAG: --log-text-info-buffer-size="0" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.049917 4677 flags.go:64] FLAG: --log-text-split-stream="false" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.049925 4677 flags.go:64] FLAG: --logging-format="text" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.049934 4677 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.049944 4677 flags.go:64] FLAG: --make-iptables-util-chains="true" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.049953 4677 flags.go:64] FLAG: --manifest-url="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.049962 4677 flags.go:64] FLAG: --manifest-url-header="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.049974 4677 flags.go:64] FLAG: --max-housekeeping-interval="15s" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.049983 4677 flags.go:64] FLAG: --max-open-files="1000000" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.049994 4677 flags.go:64] FLAG: --max-pods="110" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.050003 4677 flags.go:64] FLAG: --maximum-dead-containers="-1" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.050012 4677 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.050021 4677 flags.go:64] FLAG: --memory-manager-policy="None" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.050031 4677 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.050040 4677 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.050049 4677 flags.go:64] FLAG: --node-ip="192.168.126.11" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.050068 4677 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.050088 4677 flags.go:64] FLAG: --node-status-max-images="50" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.050097 4677 flags.go:64] FLAG: --node-status-update-frequency="10s" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.050106 4677 flags.go:64] FLAG: --oom-score-adj="-999" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.050115 4677 flags.go:64] FLAG: --pod-cidr="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.050124 4677 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.050137 4677 flags.go:64] FLAG: --pod-manifest-path="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.050146 4677 flags.go:64] FLAG: --pod-max-pids="-1" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.050157 4677 flags.go:64] FLAG: --pods-per-core="0" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.050168 4677 flags.go:64] FLAG: --port="10250" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.050180 4677 flags.go:64] FLAG: --protect-kernel-defaults="false" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.050191 4677 flags.go:64] FLAG: 
--provider-id="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.050203 4677 flags.go:64] FLAG: --qos-reserved="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.050213 4677 flags.go:64] FLAG: --read-only-port="10255" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.050223 4677 flags.go:64] FLAG: --register-node="true" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.050232 4677 flags.go:64] FLAG: --register-schedulable="true" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.050241 4677 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.050256 4677 flags.go:64] FLAG: --registry-burst="10" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.050266 4677 flags.go:64] FLAG: --registry-qps="5" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.050275 4677 flags.go:64] FLAG: --reserved-cpus="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.050284 4677 flags.go:64] FLAG: --reserved-memory="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.050296 4677 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.050305 4677 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.050314 4677 flags.go:64] FLAG: --rotate-certificates="false" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.050323 4677 flags.go:64] FLAG: --rotate-server-certificates="false" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.050332 4677 flags.go:64] FLAG: --runonce="false" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.050341 4677 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.050350 4677 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.050360 4677 flags.go:64] FLAG: --seccomp-default="false" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.050369 4677 flags.go:64] FLAG: --serialize-image-pulls="true" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.050377 4677 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.050387 4677 flags.go:64] FLAG: --storage-driver-db="cadvisor" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.050396 4677 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.050409 4677 flags.go:64] FLAG: --storage-driver-password="root" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.050418 4677 flags.go:64] FLAG: --storage-driver-secure="false" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.050427 4677 flags.go:64] FLAG: --storage-driver-table="stats" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.050463 4677 flags.go:64] FLAG: --storage-driver-user="root" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.050472 4677 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.050482 4677 flags.go:64] FLAG: --sync-frequency="1m0s" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.050491 4677 flags.go:64] FLAG: --system-cgroups="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.050500 4677 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.050514 4677 flags.go:64] FLAG: 
--system-reserved-cgroup="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.050523 4677 flags.go:64] FLAG: --tls-cert-file="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.050532 4677 flags.go:64] FLAG: --tls-cipher-suites="[]" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.050544 4677 flags.go:64] FLAG: --tls-min-version="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.050553 4677 flags.go:64] FLAG: --tls-private-key-file="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.050562 4677 flags.go:64] FLAG: --topology-manager-policy="none" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.050571 4677 flags.go:64] FLAG: --topology-manager-policy-options="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.050579 4677 flags.go:64] FLAG: --topology-manager-scope="container" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.050589 4677 flags.go:64] FLAG: --v="2" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.050602 4677 flags.go:64] FLAG: --version="false" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.050613 4677 flags.go:64] FLAG: --vmodule="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.050625 4677 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.050634 4677 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.050854 4677 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.050866 4677 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.050876 4677 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.050887 4677 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
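Annotation (not part of the journal): the flags.go:64 FLAG: entries above are the kubelet echoing every command-line flag it parsed at startup. A minimal sketch, assuming this boot's journal text is piped on stdin and with the function name parse_flag_dump invented here, that collects those entries into a dict so the effective command line can be compared against what the config file is expected to own.

# Minimal sketch (assumed input: journal text on stdin): collect the kubelet's
# startup FLAG dump into a {flag-name: value} dict for easy inspection.
import re
import sys

FLAG_LINE = re.compile(r'FLAG:\s+(--[\w.-]+)="([^"]*)"')

def parse_flag_dump(journal_text: str) -> dict[str, str]:
    """Return {flag-name: value} for every FLAG: entry found in the dump."""
    return {name: value for name, value in FLAG_LINE.findall(journal_text)}

if __name__ == "__main__":
    flags = parse_flag_dump(sys.stdin.read())
    for name in ("--config", "--kubeconfig", "--node-ip", "--system-reserved"):
        print(f"{name} = {flags.get(name, '<not logged>')}")

On this boot the dump shows, for example, --config="/etc/kubernetes/kubelet.conf", --node-ip="192.168.126.11" and --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi", which lines up with the deprecation warnings earlier in the log.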
Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.050897 4677 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.050907 4677 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.050916 4677 feature_gate.go:330] unrecognized feature gate: GatewayAPI Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.050964 4677 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.050975 4677 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.050983 4677 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.050991 4677 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.051003 4677 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.051010 4677 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.051018 4677 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.051026 4677 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.051033 4677 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.051042 4677 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.051050 4677 feature_gate.go:330] unrecognized feature gate: Example Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.051060 4677 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.051069 4677 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.051076 4677 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.051085 4677 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.051092 4677 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.051100 4677 feature_gate.go:330] unrecognized feature gate: SignatureStores Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.051107 4677 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.051115 4677 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.051123 4677 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.051130 4677 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.051139 4677 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.051148 4677 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.051158 4677 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.051168 4677 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.051181 4677 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.051194 4677 feature_gate.go:330] unrecognized feature gate: InsightsConfig Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.051203 4677 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.051213 4677 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.051222 4677 feature_gate.go:330] unrecognized feature gate: PinnedImages Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.051230 4677 feature_gate.go:330] unrecognized feature gate: OVNObservability Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.051239 4677 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.051250 4677 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.051258 4677 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.051265 4677 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.051273 4677 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.051283 4677 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.051291 4677 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.051298 4677 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.051306 4677 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.051313 4677 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.051321 4677 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.051328 4677 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.051336 4677 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.051345 4677 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.051353 4677 feature_gate.go:330] unrecognized feature gate: NewOLM Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.051360 4677 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.051368 4677 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.051375 4677 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.051383 4677 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.051390 4677 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.051399 4677 feature_gate.go:330] 
unrecognized feature gate: PlatformOperators Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.051406 4677 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.051414 4677 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.051448 4677 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.051458 4677 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.051477 4677 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.051485 4677 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.051495 4677 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.051505 4677 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.051514 4677 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.051523 4677 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.051531 4677 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.051539 4677 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.051570 4677 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.062920 4677 server.go:491] "Kubelet version" kubeletVersion="v1.31.5" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.062974 4677 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.063117 4677 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.063131 4677 feature_gate.go:330] unrecognized feature gate: PlatformOperators Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.063140 4677 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.063153 4677 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.063165 4677 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.063174 4677 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.063184 4677 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.063193 4677 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.063201 4677 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.063209 4677 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.063219 4677 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.063228 4677 feature_gate.go:330] unrecognized feature gate: InsightsConfig Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.063236 4677 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.063244 4677 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.063254 4677 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.063262 4677 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.063270 4677 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.063278 4677 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.063286 4677 feature_gate.go:330] unrecognized feature gate: NewOLM Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.063294 4677 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.063302 4677 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.063310 4677 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.063317 4677 feature_gate.go:330] unrecognized feature gate: Example Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.063325 4677 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.063333 4677 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.063342 4677 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.063350 4677 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.063358 4677 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.063366 4677 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.063376 4677 feature_gate.go:330] unrecognized feature gate: 
IngressControllerDynamicConfigurationManager Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.063384 4677 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.063392 4677 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.063400 4677 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.063407 4677 feature_gate.go:330] unrecognized feature gate: SignatureStores Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.063415 4677 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.063423 4677 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.063459 4677 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.063467 4677 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.063475 4677 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.063484 4677 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.063493 4677 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.063505 4677 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.063514 4677 feature_gate.go:330] unrecognized feature gate: PinnedImages Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.063524 4677 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.063534 4677 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.063542 4677 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.063578 4677 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.063589 4677 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.063597 4677 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.063605 4677 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.063614 4677 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.063624 4677 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.063634 4677 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.063642 4677 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.063650 4677 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.063659 4677 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.063666 4677 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.063674 4677 feature_gate.go:330] unrecognized feature gate: OVNObservability Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.063681 4677 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.063689 4677 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.063696 4677 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.063704 4677 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.063713 4677 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.063724 4677 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.063734 4677 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.063745 4677 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.063753 4677 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.063761 4677 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.063769 4677 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.063776 4677 feature_gate.go:330] unrecognized feature gate: GatewayAPI Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.063784 4677 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.063798 4677 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.064066 4677 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.064079 4677 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.064089 4677 feature_gate.go:330] unrecognized feature 
gate: OpenShiftPodSecurityAdmission Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.064097 4677 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.064105 4677 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.064113 4677 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.064122 4677 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.064130 4677 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.064140 4677 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.064148 4677 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.064156 4677 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.064163 4677 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.064174 4677 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.064183 4677 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.064192 4677 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.064200 4677 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.064208 4677 feature_gate.go:330] unrecognized feature gate: SignatureStores Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.064216 4677 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.064224 4677 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.064232 4677 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.064240 4677 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.064247 4677 feature_gate.go:330] unrecognized feature gate: Example Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.064255 4677 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.064263 4677 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.064272 4677 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.064279 4677 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.064287 4677 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.064294 4677 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Oct 07 13:07:09 crc 
kubenswrapper[4677]: W1007 13:07:09.064302 4677 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.064311 4677 feature_gate.go:330] unrecognized feature gate: PlatformOperators Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.064318 4677 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.064326 4677 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.064336 4677 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.064345 4677 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.064353 4677 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.064363 4677 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.064373 4677 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.064381 4677 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.064390 4677 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.064397 4677 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.064406 4677 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.064413 4677 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.064421 4677 feature_gate.go:330] unrecognized feature gate: NewOLM Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.064454 4677 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.064462 4677 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.064470 4677 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.064478 4677 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.064486 4677 feature_gate.go:330] unrecognized feature gate: GatewayAPI Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.064493 4677 feature_gate.go:330] unrecognized feature gate: InsightsConfig Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.064501 4677 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.064508 4677 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.064516 4677 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.064523 4677 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 
13:07:09.064531 4677 feature_gate.go:330] unrecognized feature gate: PinnedImages Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.064538 4677 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.064546 4677 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.064555 4677 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.064566 4677 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.064575 4677 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.064584 4677 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.064592 4677 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.064601 4677 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.064609 4677 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.064617 4677 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.064625 4677 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.064633 4677 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.064642 4677 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.064649 4677 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.064658 4677 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.064666 4677 feature_gate.go:330] unrecognized feature gate: OVNObservability Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.064674 4677 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.064686 4677 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.065791 4677 server.go:940] "Client rotation is on, will bootstrap in background" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.072809 4677 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.072954 4677 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". 
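Annotation (not part of the journal): the "unrecognized feature gate" warnings recur several times because the gate list appears to be evaluated on more than one startup pass (note the repeated feature_gate.go:386 summaries, which all resolve to the same map). A short sketch, again assuming the boot's journal text on stdin and with the helper name unknown_gates invented for the example, that reduces the noise to the distinct gate names and how often each was reported; the distinct set is what is worth reviewing, since these look like cluster-level (OpenShift) gates rather than gates the kubelet binary itself knows.

# Sketch: dedupe the repeated "unrecognized feature gate" warnings from this
# boot's journal (assumed on stdin) and count how often each gate was reported.
import re
import sys
from collections import Counter

GATE_RE = re.compile(r"unrecognized feature gate: (\w+)")

def unknown_gates(journal_text: str) -> Counter:
    """Count warnings per gate name; the key set is what needs review."""
    return Counter(GATE_RE.findall(journal_text))

if __name__ == "__main__":
    counts = unknown_gates(sys.stdin.read())
    for gate, seen in sorted(counts.items()):
        print(f"{gate}: warned {seen}x")
    print(f"{len(counts)} distinct unrecognized gates")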
Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.076490 4677 server.go:997] "Starting client certificate rotation" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.076522 4677 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.079012 4677 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2025-11-28 17:25:40.629239873 +0000 UTC Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.079386 4677 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 1252h18m31.549861237s for next certificate rotation Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.107710 4677 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.112656 4677 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.130824 4677 log.go:25] "Validated CRI v1 runtime API" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.169286 4677 log.go:25] "Validated CRI v1 image API" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.171622 4677 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.179079 4677 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2025-10-07-13-03-04-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.179142 4677 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:45 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:42 fsType:tmpfs blockSize:0}] Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.204065 4677 manager.go:217] Machine: {Timestamp:2025-10-07 13:07:09.201661279 +0000 UTC m=+0.687370464 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654128640 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:68c6c527-b248-4c1e-9fd2-b44685e78bcf BootID:2461c0fe-8a8b-483d-90f2-2a3d8d7aca47 Filesystems:[{Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:45 Capacity:3365412864 Type:vfs Inodes:821634 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:42 Capacity:1073741824 Type:vfs Inodes:4108170 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 
Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:61:d3:88 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:61:d3:88 Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:cc:c4:de Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:89:52:28 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:fa:be:9e Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:62:80:8a Speed:-1 Mtu:1496} {Name:eth10 MacAddress:ee:c1:a5:96:52:01 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:fa:95:05:30:27:9b Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654128640 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: 
DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.204572 4677 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.204810 4677 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.205487 4677 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.205780 4677 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.205825 4677 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.206129 4677 topology_manager.go:138] "Creating topology manager with none policy" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.206149 4677 container_manager_linux.go:303] "Creating device plugin manager" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.206795 4677 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.206841 4677 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Oct 07 
13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.207117 4677 state_mem.go:36] "Initialized new in-memory state store" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.207278 4677 server.go:1245] "Using root directory" path="/var/lib/kubelet" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.214009 4677 kubelet.go:418] "Attempting to sync node with API server" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.214053 4677 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.214089 4677 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.214109 4677 kubelet.go:324] "Adding apiserver pod source" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.214132 4677 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.218500 4677 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1" Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.218888 4677 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.65:6443: connect: connection refused Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.218892 4677 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.65:6443: connect: connection refused Oct 07 13:07:09 crc kubenswrapper[4677]: E1007 13:07:09.219021 4677 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.65:6443: connect: connection refused" logger="UnhandledError" Oct 07 13:07:09 crc kubenswrapper[4677]: E1007 13:07:09.219046 4677 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.65:6443: connect: connection refused" logger="UnhandledError" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.219661 4677 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". 
Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.222934 4677 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.224597 4677 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.224639 4677 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.224655 4677 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.224669 4677 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.224691 4677 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.224704 4677 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.224717 4677 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.224741 4677 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.224757 4677 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.224773 4677 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.224791 4677 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.224804 4677 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.225705 4677 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.226518 4677 server.go:1280] "Started kubelet" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.226734 4677 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.226955 4677 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.227854 4677 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.227935 4677 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.65:6443: connect: connection refused Oct 07 13:07:09 crc systemd[1]: Started Kubernetes Kubelet. 
Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.231484 4677 server.go:460] "Adding debug handlers to kubelet server" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.232588 4677 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.232645 4677 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.234210 4677 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-04 12:08:07.640128752 +0000 UTC Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.234425 4677 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 1391h0m58.40571467s for next certificate rotation Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.234729 4677 volume_manager.go:287] "The desired_state_of_world populator starts" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.234772 4677 volume_manager.go:289] "Starting Kubelet Volume Manager" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.234943 4677 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.241629 4677 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.65:6443: connect: connection refused Oct 07 13:07:09 crc kubenswrapper[4677]: E1007 13:07:09.241778 4677 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.65:6443: connect: connection refused" logger="UnhandledError" Oct 07 13:07:09 crc kubenswrapper[4677]: E1007 13:07:09.234936 4677 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.65:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.186c375d86b4423c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-10-07 13:07:09.2264679 +0000 UTC m=+0.712177075,LastTimestamp:2025-10-07 13:07:09.2264679 +0000 UTC m=+0.712177075,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Oct 07 13:07:09 crc kubenswrapper[4677]: E1007 13:07:09.241870 4677 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.242365 4677 factory.go:55] Registering systemd factory Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.242385 4677 factory.go:221] Registration of the systemd container factory successfully Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.243186 4677 factory.go:153] Registering CRI-O factory Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.243243 4677 factory.go:221] Registration of the crio container factory successfully Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 
13:07:09.243407 4677 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.243506 4677 factory.go:103] Registering Raw factory Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.243544 4677 manager.go:1196] Started watching for new ooms in manager Oct 07 13:07:09 crc kubenswrapper[4677]: E1007 13:07:09.243010 4677 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.65:6443: connect: connection refused" interval="200ms" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.245807 4677 manager.go:319] Starting recovery of all containers Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.257255 4677 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.257645 4677 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.257820 4677 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.257960 4677 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.258099 4677 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.258285 4677 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.258422 4677 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.258593 4677 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Oct 07 13:07:09 crc 
kubenswrapper[4677]: I1007 13:07:09.258780 4677 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.258968 4677 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.259138 4677 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.260808 4677 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.260972 4677 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.261185 4677 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.261351 4677 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.261626 4677 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.261804 4677 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.261943 4677 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.262068 4677 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 
13:07:09.262192 4677 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.262326 4677 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.262534 4677 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.262681 4677 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.262808 4677 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.262934 4677 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.263055 4677 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.263205 4677 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.263339 4677 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.263498 4677 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.263650 4677 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.263778 4677 reconstruct.go:130] "Volume is 
marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.263944 4677 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.264164 4677 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.264379 4677 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.264541 4677 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.264689 4677 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.264821 4677 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.264992 4677 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.265201 4677 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.265379 4677 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.265547 4677 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.265679 4677 reconstruct.go:130] "Volume is marked as uncertain and 
added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.265832 4677 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.265965 4677 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.266106 4677 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.266327 4677 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.266564 4677 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.266760 4677 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.266946 4677 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.267112 4677 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.267292 4677 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.267467 4677 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.267631 4677 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.268048 4677 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.268192 4677 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.268325 4677 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.268497 4677 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.268631 4677 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.268775 4677 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.268909 4677 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.269034 4677 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.269168 4677 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.269295 4677 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.269456 4677 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" 
volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.269613 4677 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.269753 4677 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.269889 4677 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.270016 4677 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.270138 4677 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.270271 4677 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.270398 4677 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.270567 4677 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.270730 4677 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.270918 4677 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.271078 4677 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" 
volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.271262 4677 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.266496 4677 manager.go:324] Recovery completed Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.271425 4677 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.271787 4677 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.271977 4677 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.272131 4677 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.272276 4677 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.272408 4677 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.272621 4677 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.272757 4677 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.272910 4677 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.273041 4677 reconstruct.go:130] "Volume is marked as uncertain and added into 
the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.273165 4677 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.273294 4677 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.273420 4677 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.273581 4677 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.273732 4677 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.273870 4677 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.273998 4677 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.274179 4677 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.274349 4677 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.274527 4677 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.274662 4677 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.274811 4677 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.274991 4677 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.275166 4677 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.275335 4677 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.275570 4677 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.275755 4677 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.275967 4677 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.276155 4677 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.276361 4677 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.276639 4677 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.276819 4677 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.277003 4677 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.277218 4677 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.277408 4677 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.277613 4677 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.277781 4677 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.277914 4677 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.278044 4677 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.278207 4677 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.278397 4677 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.278628 4677 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.278810 4677 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" 
volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.279043 4677 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.279248 4677 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.279406 4677 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.279561 4677 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.279659 4677 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.279692 4677 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.279726 4677 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.279752 4677 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.279809 4677 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.279840 4677 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.279860 4677 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" 
volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.279881 4677 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.279901 4677 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.279920 4677 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.279940 4677 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.279958 4677 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.279978 4677 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.279997 4677 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.280020 4677 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.280041 4677 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.284600 4677 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.285110 4677 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" 
deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.285188 4677 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.285218 4677 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.285241 4677 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.285262 4677 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.285283 4677 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.285306 4677 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.285325 4677 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.285345 4677 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.285365 4677 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.285386 4677 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.285423 4677 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.285494 4677 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.285517 4677 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.285537 4677 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.285558 4677 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.285576 4677 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.286047 4677 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.286088 4677 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.286115 4677 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.286132 4677 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.286148 4677 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.286168 4677 reconstruct.go:130] "Volume is marked as uncertain and added into 
the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.286182 4677 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.286204 4677 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.286218 4677 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.286233 4677 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.286252 4677 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.286267 4677 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.286294 4677 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.286307 4677 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.286321 4677 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.286339 4677 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.286353 4677 reconstruct.go:130] "Volume is marked as uncertain and added into the actual 
state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.286388 4677 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.286409 4677 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.286451 4677 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.286482 4677 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.286501 4677 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.286521 4677 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.286535 4677 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.286549 4677 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.286591 4677 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.286605 4677 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.286622 4677 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.286638 4677 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.286652 4677 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.286670 4677 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.286684 4677 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.286703 4677 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.286718 4677 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.286732 4677 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.286749 4677 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.286763 4677 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.286780 4677 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.286793 4677 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" 
volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.286806 4677 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.286824 4677 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.286837 4677 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.286857 4677 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.286870 4677 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.286885 4677 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.286903 4677 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.286919 4677 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.286939 4677 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.286952 4677 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.286967 4677 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" 
volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.286986 4677 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.286999 4677 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.287012 4677 reconstruct.go:97] "Volume reconstruction finished" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.287021 4677 reconciler.go:26] "Reconciler: start to sync state" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.289739 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.289794 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.289807 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.291113 4677 cpu_manager.go:225] "Starting CPU manager" policy="none" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.291136 4677 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.291330 4677 state_mem.go:36] "Initialized new in-memory state store" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.299908 4677 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.301668 4677 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.301749 4677 status_manager.go:217] "Starting to sync pod status with apiserver" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.301802 4677 kubelet.go:2335] "Starting kubelet main sync loop" Oct 07 13:07:09 crc kubenswrapper[4677]: E1007 13:07:09.302025 4677 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.304479 4677 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.65:6443: connect: connection refused Oct 07 13:07:09 crc kubenswrapper[4677]: E1007 13:07:09.304611 4677 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.65:6443: connect: connection refused" logger="UnhandledError" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.306322 4677 policy_none.go:49] "None policy: Start" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.307284 4677 memory_manager.go:170] "Starting memorymanager" policy="None" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.307311 4677 state_mem.go:35] "Initializing new in-memory state store" Oct 07 13:07:09 crc kubenswrapper[4677]: E1007 13:07:09.342032 4677 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.370726 4677 manager.go:334] "Starting Device Plugin manager" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.371026 4677 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.371043 4677 server.go:79] "Starting device plugin registration server" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.371529 4677 eviction_manager.go:189] "Eviction manager: starting control loop" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.371559 4677 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.371785 4677 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.371891 4677 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.371901 4677 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Oct 07 13:07:09 crc kubenswrapper[4677]: E1007 13:07:09.378681 4677 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.402456 4677 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc","openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Oct 07 13:07:09 crc kubenswrapper[4677]: 
I1007 13:07:09.402596 4677 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.404019 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.404063 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.404076 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.404276 4677 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.404454 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.404506 4677 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.405284 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.405314 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.405325 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.405313 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.405424 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.405448 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.405478 4677 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.405662 4677 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.405696 4677 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.406134 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.406166 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.406178 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.406350 4677 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.406368 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.406386 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.406396 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.406468 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.406498 4677 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.408334 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.408358 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.408366 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.408365 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.408496 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.408463 4677 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.408514 4677 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.408540 4677 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.408507 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.409176 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.409196 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.409200 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.409211 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.409229 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.409221 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.409403 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.409424 4677 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.410026 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.410050 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.410063 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:09 crc kubenswrapper[4677]: E1007 13:07:09.444412 4677 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.65:6443: connect: connection refused" interval="400ms" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.471667 4677 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.473055 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.473092 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.473101 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.473124 4677 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 07 13:07:09 crc kubenswrapper[4677]: E1007 13:07:09.473604 4677 
kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.65:6443: connect: connection refused" node="crc" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.489416 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.489483 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.489506 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.489526 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.489634 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.489744 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.489845 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.489907 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.489938 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.490037 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.490069 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.490148 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.490207 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.490281 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.490309 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.591519 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.591612 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.591668 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.591716 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.591740 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.591857 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.591782 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.591814 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.591831 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.591766 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.592068 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.592102 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.592134 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: 
\"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.592165 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.592200 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.592228 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.592233 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.592261 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.592231 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.592294 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.592320 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.592398 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.592415 4677 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.592507 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.592525 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.592561 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.592593 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.592602 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.592651 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.592771 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.673903 4677 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.675378 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.675462 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.675477 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.675500 4677 kubelet_node_status.go:76] "Attempting to 
register node" node="crc" Oct 07 13:07:09 crc kubenswrapper[4677]: E1007 13:07:09.675867 4677 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.65:6443: connect: connection refused" node="crc" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.738613 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.745915 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.766502 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.781313 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 07 13:07:09 crc kubenswrapper[4677]: I1007 13:07:09.786596 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.792480 4677 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-c45a5ca1403c2d54fad0faf3c81c121626a7816de3c304ff8253fec6bf56c680 WatchSource:0}: Error finding container c45a5ca1403c2d54fad0faf3c81c121626a7816de3c304ff8253fec6bf56c680: Status 404 returned error can't find the container with id c45a5ca1403c2d54fad0faf3c81c121626a7816de3c304ff8253fec6bf56c680 Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.794982 4677 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-f402cbbb73cc2b25101aebbc90f86b5930e1df7289c4e5b88a181808f3b6c3d1 WatchSource:0}: Error finding container f402cbbb73cc2b25101aebbc90f86b5930e1df7289c4e5b88a181808f3b6c3d1: Status 404 returned error can't find the container with id f402cbbb73cc2b25101aebbc90f86b5930e1df7289c4e5b88a181808f3b6c3d1 Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.804140 4677 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-d2cfb91ea77680ef298b19093856207014ee6ff026cc7b7bd98c8d88bcf5bc62 WatchSource:0}: Error finding container d2cfb91ea77680ef298b19093856207014ee6ff026cc7b7bd98c8d88bcf5bc62: Status 404 returned error can't find the container with id d2cfb91ea77680ef298b19093856207014ee6ff026cc7b7bd98c8d88bcf5bc62 Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.813514 4677 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-274134b610e230a6b93610a8fecf30824fc4dfe1de9972d85f52556c64856a01 WatchSource:0}: Error finding container 274134b610e230a6b93610a8fecf30824fc4dfe1de9972d85f52556c64856a01: Status 404 returned error can't find the container with id 274134b610e230a6b93610a8fecf30824fc4dfe1de9972d85f52556c64856a01 Oct 07 13:07:09 crc kubenswrapper[4677]: W1007 13:07:09.816319 4677 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-5262d9e814704bc7b8130be2d432b20c6ee4de30e4251aabe2956c44b4e610ca WatchSource:0}: Error finding container 5262d9e814704bc7b8130be2d432b20c6ee4de30e4251aabe2956c44b4e610ca: Status 404 returned error can't find the container with id 5262d9e814704bc7b8130be2d432b20c6ee4de30e4251aabe2956c44b4e610ca Oct 07 13:07:09 crc kubenswrapper[4677]: E1007 13:07:09.845719 4677 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.65:6443: connect: connection refused" interval="800ms" Oct 07 13:07:10 crc kubenswrapper[4677]: I1007 13:07:10.076054 4677 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 13:07:10 crc kubenswrapper[4677]: I1007 13:07:10.077886 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:10 crc kubenswrapper[4677]: I1007 13:07:10.078377 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:10 crc kubenswrapper[4677]: I1007 13:07:10.078399 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:10 crc kubenswrapper[4677]: I1007 13:07:10.078470 4677 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 07 13:07:10 crc kubenswrapper[4677]: E1007 13:07:10.079118 4677 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.65:6443: connect: connection refused" node="crc" Oct 07 13:07:10 crc kubenswrapper[4677]: W1007 13:07:10.156064 4677 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.65:6443: connect: connection refused Oct 07 13:07:10 crc kubenswrapper[4677]: E1007 13:07:10.156207 4677 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.65:6443: connect: connection refused" logger="UnhandledError" Oct 07 13:07:10 crc kubenswrapper[4677]: W1007 13:07:10.197297 4677 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.65:6443: connect: connection refused Oct 07 13:07:10 crc kubenswrapper[4677]: E1007 13:07:10.197365 4677 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.65:6443: connect: connection refused" logger="UnhandledError" Oct 07 13:07:10 crc kubenswrapper[4677]: I1007 13:07:10.229514 4677 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 
38.102.83.65:6443: connect: connection refused Oct 07 13:07:10 crc kubenswrapper[4677]: W1007 13:07:10.306178 4677 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.65:6443: connect: connection refused Oct 07 13:07:10 crc kubenswrapper[4677]: E1007 13:07:10.306289 4677 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.65:6443: connect: connection refused" logger="UnhandledError" Oct 07 13:07:10 crc kubenswrapper[4677]: I1007 13:07:10.308986 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"d2cfb91ea77680ef298b19093856207014ee6ff026cc7b7bd98c8d88bcf5bc62"} Oct 07 13:07:10 crc kubenswrapper[4677]: I1007 13:07:10.310664 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"f402cbbb73cc2b25101aebbc90f86b5930e1df7289c4e5b88a181808f3b6c3d1"} Oct 07 13:07:10 crc kubenswrapper[4677]: I1007 13:07:10.312067 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"c45a5ca1403c2d54fad0faf3c81c121626a7816de3c304ff8253fec6bf56c680"} Oct 07 13:07:10 crc kubenswrapper[4677]: I1007 13:07:10.313695 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"5262d9e814704bc7b8130be2d432b20c6ee4de30e4251aabe2956c44b4e610ca"} Oct 07 13:07:10 crc kubenswrapper[4677]: I1007 13:07:10.314708 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"274134b610e230a6b93610a8fecf30824fc4dfe1de9972d85f52556c64856a01"} Oct 07 13:07:10 crc kubenswrapper[4677]: W1007 13:07:10.584467 4677 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.65:6443: connect: connection refused Oct 07 13:07:10 crc kubenswrapper[4677]: E1007 13:07:10.584543 4677 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.65:6443: connect: connection refused" logger="UnhandledError" Oct 07 13:07:10 crc kubenswrapper[4677]: E1007 13:07:10.646856 4677 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.65:6443: connect: connection refused" interval="1.6s" Oct 07 13:07:10 crc kubenswrapper[4677]: I1007 13:07:10.879202 4677 kubelet_node_status.go:401] "Setting node annotation to enable 
volume controller attach/detach" Oct 07 13:07:10 crc kubenswrapper[4677]: I1007 13:07:10.880356 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:10 crc kubenswrapper[4677]: I1007 13:07:10.880398 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:10 crc kubenswrapper[4677]: I1007 13:07:10.880407 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:10 crc kubenswrapper[4677]: I1007 13:07:10.880425 4677 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 07 13:07:10 crc kubenswrapper[4677]: E1007 13:07:10.881509 4677 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.65:6443: connect: connection refused" node="crc" Oct 07 13:07:11 crc kubenswrapper[4677]: I1007 13:07:11.228759 4677 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.65:6443: connect: connection refused Oct 07 13:07:11 crc kubenswrapper[4677]: I1007 13:07:11.319512 4677 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="a70bca62773a15d295207b342b32a4263173ee7ebee7222bb16204210e168a52" exitCode=0 Oct 07 13:07:11 crc kubenswrapper[4677]: I1007 13:07:11.319596 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"a70bca62773a15d295207b342b32a4263173ee7ebee7222bb16204210e168a52"} Oct 07 13:07:11 crc kubenswrapper[4677]: I1007 13:07:11.319609 4677 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 13:07:11 crc kubenswrapper[4677]: I1007 13:07:11.321813 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:11 crc kubenswrapper[4677]: I1007 13:07:11.321855 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:11 crc kubenswrapper[4677]: I1007 13:07:11.321874 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:11 crc kubenswrapper[4677]: I1007 13:07:11.322501 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"7ecf81a2a9f147c0d9643f8e6c45248164053203ca4e5bbdc57c38e5803a5386"} Oct 07 13:07:11 crc kubenswrapper[4677]: I1007 13:07:11.322532 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"f36e52a7e88b59d8fd38c1fe659ce9b539e514c9d31e326a3ed647ebb8d19781"} Oct 07 13:07:11 crc kubenswrapper[4677]: I1007 13:07:11.322547 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"84b1c015461fecca9e5122abe950f33e24f4b7188568ea84cb059a08a4637963"} Oct 07 13:07:11 crc 
kubenswrapper[4677]: I1007 13:07:11.326047 4677 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="1d927fbc5242eb951e8c2ec6d2859315f34e4b190bb43e646044111d1f583bf0" exitCode=0 Oct 07 13:07:11 crc kubenswrapper[4677]: I1007 13:07:11.326107 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"1d927fbc5242eb951e8c2ec6d2859315f34e4b190bb43e646044111d1f583bf0"} Oct 07 13:07:11 crc kubenswrapper[4677]: I1007 13:07:11.326226 4677 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 13:07:11 crc kubenswrapper[4677]: I1007 13:07:11.327691 4677 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="f16639187300b01b7c9f30dda26da7c1de9de9c66baf7ad716875eba41a5d7d6" exitCode=0 Oct 07 13:07:11 crc kubenswrapper[4677]: I1007 13:07:11.327741 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"f16639187300b01b7c9f30dda26da7c1de9de9c66baf7ad716875eba41a5d7d6"} Oct 07 13:07:11 crc kubenswrapper[4677]: I1007 13:07:11.327838 4677 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 13:07:11 crc kubenswrapper[4677]: I1007 13:07:11.330665 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:11 crc kubenswrapper[4677]: I1007 13:07:11.330695 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:11 crc kubenswrapper[4677]: I1007 13:07:11.330710 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:11 crc kubenswrapper[4677]: I1007 13:07:11.332262 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:11 crc kubenswrapper[4677]: I1007 13:07:11.332325 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:11 crc kubenswrapper[4677]: I1007 13:07:11.332353 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:11 crc kubenswrapper[4677]: I1007 13:07:11.333113 4677 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="6cd7e95c3dad799fcc99041e53970f7f6be7e9f4280d724394e9a06051043706" exitCode=0 Oct 07 13:07:11 crc kubenswrapper[4677]: I1007 13:07:11.333184 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"6cd7e95c3dad799fcc99041e53970f7f6be7e9f4280d724394e9a06051043706"} Oct 07 13:07:11 crc kubenswrapper[4677]: I1007 13:07:11.333342 4677 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 13:07:11 crc kubenswrapper[4677]: I1007 13:07:11.334748 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:11 crc kubenswrapper[4677]: I1007 13:07:11.334784 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:11 
crc kubenswrapper[4677]: I1007 13:07:11.334816 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:11 crc kubenswrapper[4677]: I1007 13:07:11.337643 4677 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 13:07:11 crc kubenswrapper[4677]: I1007 13:07:11.342138 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:11 crc kubenswrapper[4677]: I1007 13:07:11.342169 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:11 crc kubenswrapper[4677]: I1007 13:07:11.342180 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:12 crc kubenswrapper[4677]: I1007 13:07:12.228938 4677 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.65:6443: connect: connection refused Oct 07 13:07:12 crc kubenswrapper[4677]: E1007 13:07:12.247301 4677 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.65:6443: connect: connection refused" interval="3.2s" Oct 07 13:07:12 crc kubenswrapper[4677]: I1007 13:07:12.346477 4677 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="dab4e59081dcacddedf345841dce8c940fae52aeee55d6c956e1364dd70872b5" exitCode=0 Oct 07 13:07:12 crc kubenswrapper[4677]: I1007 13:07:12.346537 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"dab4e59081dcacddedf345841dce8c940fae52aeee55d6c956e1364dd70872b5"} Oct 07 13:07:12 crc kubenswrapper[4677]: I1007 13:07:12.346801 4677 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 13:07:12 crc kubenswrapper[4677]: I1007 13:07:12.349073 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:12 crc kubenswrapper[4677]: I1007 13:07:12.349114 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:12 crc kubenswrapper[4677]: I1007 13:07:12.349241 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:12 crc kubenswrapper[4677]: I1007 13:07:12.352998 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"c50c0057805c0f200830303329e9c3c8c75c20246ace7131caff6afb6aca6f79"} Oct 07 13:07:12 crc kubenswrapper[4677]: I1007 13:07:12.353041 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"10f7bfbc2ad9b8a554ee30118d74323aeddc334939ab97ab61cbd5eb24ae1db3"} Oct 07 13:07:12 crc kubenswrapper[4677]: I1007 13:07:12.353053 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" 
event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"ffff3685813b0115a56c61e90cb80318d0265429d9be16eeb9a4d0870ec2a442"} Oct 07 13:07:12 crc kubenswrapper[4677]: I1007 13:07:12.353114 4677 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 13:07:12 crc kubenswrapper[4677]: I1007 13:07:12.353686 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:12 crc kubenswrapper[4677]: I1007 13:07:12.353710 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:12 crc kubenswrapper[4677]: I1007 13:07:12.353721 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:12 crc kubenswrapper[4677]: I1007 13:07:12.355994 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"373e947054a32f5e1ecb5b66d2a5e668a14a1c76b2329cc4a60ddee65c80a3e0"} Oct 07 13:07:12 crc kubenswrapper[4677]: I1007 13:07:12.356054 4677 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 13:07:12 crc kubenswrapper[4677]: I1007 13:07:12.356838 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:12 crc kubenswrapper[4677]: I1007 13:07:12.356866 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:12 crc kubenswrapper[4677]: I1007 13:07:12.356876 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:12 crc kubenswrapper[4677]: I1007 13:07:12.359338 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"9e8dc3f8bdc52104efdb49a017d6497e2aaa3ed2b593794413fcd1acf2e06d36"} Oct 07 13:07:12 crc kubenswrapper[4677]: I1007 13:07:12.359418 4677 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 13:07:12 crc kubenswrapper[4677]: I1007 13:07:12.360048 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:12 crc kubenswrapper[4677]: I1007 13:07:12.360072 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:12 crc kubenswrapper[4677]: I1007 13:07:12.360081 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:12 crc kubenswrapper[4677]: I1007 13:07:12.363885 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"7b92962cb41f37a615f473651c01e37f5d53e01f3fb4b7c0eb2092095bb55239"} Oct 07 13:07:12 crc kubenswrapper[4677]: I1007 13:07:12.363976 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"0a51c6d4e82136b444754dc679f864558f74624af4ff94f794e473c92c8f6c87"} Oct 07 13:07:12 crc kubenswrapper[4677]: I1007 13:07:12.363991 
4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"96d7ddc445a61d5fd4959d1b3b4e2c93503111a12f461d945dd298a3f8540f65"} Oct 07 13:07:12 crc kubenswrapper[4677]: I1007 13:07:12.364000 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"34b15b8f71a10920a74f784c3440031e14726f661e10f628b269da08e70a7cb2"} Oct 07 13:07:12 crc kubenswrapper[4677]: I1007 13:07:12.482285 4677 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 13:07:12 crc kubenswrapper[4677]: I1007 13:07:12.483469 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:12 crc kubenswrapper[4677]: I1007 13:07:12.483515 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:12 crc kubenswrapper[4677]: I1007 13:07:12.483529 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:12 crc kubenswrapper[4677]: I1007 13:07:12.483555 4677 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 07 13:07:12 crc kubenswrapper[4677]: E1007 13:07:12.483986 4677 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.65:6443: connect: connection refused" node="crc" Oct 07 13:07:12 crc kubenswrapper[4677]: W1007 13:07:12.495168 4677 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.65:6443: connect: connection refused Oct 07 13:07:12 crc kubenswrapper[4677]: E1007 13:07:12.495228 4677 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.65:6443: connect: connection refused" logger="UnhandledError" Oct 07 13:07:12 crc kubenswrapper[4677]: W1007 13:07:12.894878 4677 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.65:6443: connect: connection refused Oct 07 13:07:12 crc kubenswrapper[4677]: E1007 13:07:12.895017 4677 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.65:6443: connect: connection refused" logger="UnhandledError" Oct 07 13:07:13 crc kubenswrapper[4677]: I1007 13:07:13.369682 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"2e5209710f5d3ab3676742d78170c9463612cae9d1c4c49a63ae23e2e4714d95"} Oct 07 13:07:13 crc kubenswrapper[4677]: I1007 13:07:13.369751 4677 kubelet_node_status.go:401] "Setting node annotation to enable volume 
controller attach/detach" Oct 07 13:07:13 crc kubenswrapper[4677]: I1007 13:07:13.371068 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:13 crc kubenswrapper[4677]: I1007 13:07:13.371102 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:13 crc kubenswrapper[4677]: I1007 13:07:13.371118 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:13 crc kubenswrapper[4677]: I1007 13:07:13.374060 4677 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="abcfe5aa83c73ae24b7f0b414b47344969924ca50a5e1669bfb4704f1386cd17" exitCode=0 Oct 07 13:07:13 crc kubenswrapper[4677]: I1007 13:07:13.374395 4677 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 13:07:13 crc kubenswrapper[4677]: I1007 13:07:13.374412 4677 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 07 13:07:13 crc kubenswrapper[4677]: I1007 13:07:13.374412 4677 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 13:07:13 crc kubenswrapper[4677]: I1007 13:07:13.374500 4677 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 13:07:13 crc kubenswrapper[4677]: I1007 13:07:13.374411 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"abcfe5aa83c73ae24b7f0b414b47344969924ca50a5e1669bfb4704f1386cd17"} Oct 07 13:07:13 crc kubenswrapper[4677]: I1007 13:07:13.374400 4677 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 13:07:13 crc kubenswrapper[4677]: I1007 13:07:13.375298 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:13 crc kubenswrapper[4677]: I1007 13:07:13.375329 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:13 crc kubenswrapper[4677]: I1007 13:07:13.375343 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:13 crc kubenswrapper[4677]: I1007 13:07:13.376648 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:13 crc kubenswrapper[4677]: I1007 13:07:13.376675 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:13 crc kubenswrapper[4677]: I1007 13:07:13.376696 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:13 crc kubenswrapper[4677]: I1007 13:07:13.376888 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:13 crc kubenswrapper[4677]: I1007 13:07:13.376941 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:13 crc kubenswrapper[4677]: I1007 13:07:13.376956 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:13 crc kubenswrapper[4677]: I1007 13:07:13.377336 4677 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:13 crc kubenswrapper[4677]: I1007 13:07:13.377383 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:13 crc kubenswrapper[4677]: I1007 13:07:13.377405 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:14 crc kubenswrapper[4677]: I1007 13:07:14.381474 4677 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 07 13:07:14 crc kubenswrapper[4677]: I1007 13:07:14.384276 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"550a4491cebcbd8b3a62831cce07b13bb79051cd51505aab1f74bcfee692f7b2"} Oct 07 13:07:14 crc kubenswrapper[4677]: I1007 13:07:14.384332 4677 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 07 13:07:14 crc kubenswrapper[4677]: I1007 13:07:14.384345 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"9182677b05c8d32f333b4e806b6dc29e0ce3f6171616ed303459ccb6a3754a4b"} Oct 07 13:07:14 crc kubenswrapper[4677]: I1007 13:07:14.384366 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"73cec6b690e4d01d9d206a812f278832d622a7bdfb74ddcfb5904e19f721fae6"} Oct 07 13:07:14 crc kubenswrapper[4677]: I1007 13:07:14.384391 4677 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 13:07:14 crc kubenswrapper[4677]: I1007 13:07:14.385804 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:14 crc kubenswrapper[4677]: I1007 13:07:14.385863 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:14 crc kubenswrapper[4677]: I1007 13:07:14.385890 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:15 crc kubenswrapper[4677]: I1007 13:07:15.140291 4677 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 07 13:07:15 crc kubenswrapper[4677]: I1007 13:07:15.140539 4677 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 13:07:15 crc kubenswrapper[4677]: I1007 13:07:15.142576 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:15 crc kubenswrapper[4677]: I1007 13:07:15.142634 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:15 crc kubenswrapper[4677]: I1007 13:07:15.142659 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:15 crc kubenswrapper[4677]: I1007 13:07:15.392903 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"e99541a9f53339e760fb1074be18ebfcb8b225c64c290478559d2e3722ba9296"} Oct 07 13:07:15 crc kubenswrapper[4677]: I1007 13:07:15.392953 4677 
prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 07 13:07:15 crc kubenswrapper[4677]: I1007 13:07:15.392978 4677 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 13:07:15 crc kubenswrapper[4677]: I1007 13:07:15.393030 4677 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 13:07:15 crc kubenswrapper[4677]: I1007 13:07:15.392964 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"d90d7cc786a9f94c269a99be97c00685a2e10bde12e0afe4db2de40b95749a47"} Oct 07 13:07:15 crc kubenswrapper[4677]: I1007 13:07:15.394461 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:15 crc kubenswrapper[4677]: I1007 13:07:15.394480 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:15 crc kubenswrapper[4677]: I1007 13:07:15.394511 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:15 crc kubenswrapper[4677]: I1007 13:07:15.394520 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:15 crc kubenswrapper[4677]: I1007 13:07:15.394529 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:15 crc kubenswrapper[4677]: I1007 13:07:15.394599 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:15 crc kubenswrapper[4677]: I1007 13:07:15.684509 4677 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 13:07:15 crc kubenswrapper[4677]: I1007 13:07:15.686315 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:15 crc kubenswrapper[4677]: I1007 13:07:15.686364 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:15 crc kubenswrapper[4677]: I1007 13:07:15.686377 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:15 crc kubenswrapper[4677]: I1007 13:07:15.686413 4677 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 07 13:07:16 crc kubenswrapper[4677]: I1007 13:07:16.398657 4677 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 13:07:16 crc kubenswrapper[4677]: I1007 13:07:16.400534 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:16 crc kubenswrapper[4677]: I1007 13:07:16.400582 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:16 crc kubenswrapper[4677]: I1007 13:07:16.400602 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:17 crc kubenswrapper[4677]: I1007 13:07:17.867584 4677 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 07 13:07:17 crc kubenswrapper[4677]: I1007 13:07:17.867898 4677 kubelet_node_status.go:401] "Setting node annotation to enable volume 
controller attach/detach" Oct 07 13:07:17 crc kubenswrapper[4677]: I1007 13:07:17.869329 4677 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 07 13:07:17 crc kubenswrapper[4677]: I1007 13:07:17.869686 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:17 crc kubenswrapper[4677]: I1007 13:07:17.869809 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:17 crc kubenswrapper[4677]: I1007 13:07:17.869838 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:18 crc kubenswrapper[4677]: I1007 13:07:18.404283 4677 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 13:07:18 crc kubenswrapper[4677]: I1007 13:07:18.405816 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:18 crc kubenswrapper[4677]: I1007 13:07:18.405891 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:18 crc kubenswrapper[4677]: I1007 13:07:18.405910 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:18 crc kubenswrapper[4677]: I1007 13:07:18.641523 4677 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Oct 07 13:07:18 crc kubenswrapper[4677]: I1007 13:07:18.641775 4677 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 13:07:18 crc kubenswrapper[4677]: I1007 13:07:18.643501 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:18 crc kubenswrapper[4677]: I1007 13:07:18.643554 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:18 crc kubenswrapper[4677]: I1007 13:07:18.643568 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:19 crc kubenswrapper[4677]: I1007 13:07:19.366022 4677 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Oct 07 13:07:19 crc kubenswrapper[4677]: E1007 13:07:19.378754 4677 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Oct 07 13:07:19 crc kubenswrapper[4677]: I1007 13:07:19.389421 4677 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 07 13:07:19 crc kubenswrapper[4677]: I1007 13:07:19.389802 4677 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 13:07:19 crc kubenswrapper[4677]: I1007 13:07:19.391205 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:19 crc kubenswrapper[4677]: I1007 13:07:19.391242 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:19 crc kubenswrapper[4677]: I1007 13:07:19.391254 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:19 crc kubenswrapper[4677]: I1007 
13:07:19.406669 4677 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 13:07:19 crc kubenswrapper[4677]: I1007 13:07:19.407939 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:19 crc kubenswrapper[4677]: I1007 13:07:19.408002 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:19 crc kubenswrapper[4677]: I1007 13:07:19.408028 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:19 crc kubenswrapper[4677]: I1007 13:07:19.489369 4677 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 07 13:07:19 crc kubenswrapper[4677]: I1007 13:07:19.489612 4677 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 13:07:19 crc kubenswrapper[4677]: I1007 13:07:19.490924 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:19 crc kubenswrapper[4677]: I1007 13:07:19.490965 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:19 crc kubenswrapper[4677]: I1007 13:07:19.490982 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:19 crc kubenswrapper[4677]: I1007 13:07:19.495257 4677 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 07 13:07:20 crc kubenswrapper[4677]: I1007 13:07:20.410001 4677 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 13:07:20 crc kubenswrapper[4677]: I1007 13:07:20.410397 4677 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 07 13:07:20 crc kubenswrapper[4677]: I1007 13:07:20.411333 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:20 crc kubenswrapper[4677]: I1007 13:07:20.411373 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:20 crc kubenswrapper[4677]: I1007 13:07:20.411386 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:20 crc kubenswrapper[4677]: I1007 13:07:20.416643 4677 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 07 13:07:21 crc kubenswrapper[4677]: I1007 13:07:21.413249 4677 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 13:07:21 crc kubenswrapper[4677]: I1007 13:07:21.414666 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:21 crc kubenswrapper[4677]: I1007 13:07:21.414733 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:21 crc kubenswrapper[4677]: I1007 13:07:21.414753 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:22 crc kubenswrapper[4677]: I1007 
13:07:22.060023 4677 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 07 13:07:22 crc kubenswrapper[4677]: I1007 13:07:22.389594 4677 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Oct 07 13:07:22 crc kubenswrapper[4677]: I1007 13:07:22.389695 4677 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 07 13:07:22 crc kubenswrapper[4677]: I1007 13:07:22.420474 4677 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 13:07:22 crc kubenswrapper[4677]: I1007 13:07:22.422211 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:22 crc kubenswrapper[4677]: I1007 13:07:22.422272 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:22 crc kubenswrapper[4677]: I1007 13:07:22.422290 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:23 crc kubenswrapper[4677]: E1007 13:07:23.142586 4677 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": net/http: TLS handshake timeout" event="&Event{ObjectMeta:{crc.186c375d86b4423c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-10-07 13:07:09.2264679 +0000 UTC m=+0.712177075,LastTimestamp:2025-10-07 13:07:09.2264679 +0000 UTC m=+0.712177075,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Oct 07 13:07:23 crc kubenswrapper[4677]: W1007 13:07:23.168840 4677 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout Oct 07 13:07:23 crc kubenswrapper[4677]: I1007 13:07:23.168947 4677 trace.go:236] Trace[471777182]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (07-Oct-2025 13:07:13.167) (total time: 10001ms): Oct 07 13:07:23 crc kubenswrapper[4677]: Trace[471777182]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (13:07:23.168) Oct 07 13:07:23 crc kubenswrapper[4677]: Trace[471777182]: [10.001803364s] [10.001803364s] END Oct 07 13:07:23 crc kubenswrapper[4677]: E1007 13:07:23.168970 4677 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: 
failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Oct 07 13:07:23 crc kubenswrapper[4677]: I1007 13:07:23.230017 4677 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Oct 07 13:07:23 crc kubenswrapper[4677]: I1007 13:07:23.422881 4677 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 13:07:23 crc kubenswrapper[4677]: I1007 13:07:23.424180 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:23 crc kubenswrapper[4677]: I1007 13:07:23.424220 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:23 crc kubenswrapper[4677]: I1007 13:07:23.424234 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:23 crc kubenswrapper[4677]: W1007 13:07:23.522142 4677 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout Oct 07 13:07:23 crc kubenswrapper[4677]: I1007 13:07:23.522225 4677 trace.go:236] Trace[1984727193]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (07-Oct-2025 13:07:13.520) (total time: 10001ms): Oct 07 13:07:23 crc kubenswrapper[4677]: Trace[1984727193]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (13:07:23.522) Oct 07 13:07:23 crc kubenswrapper[4677]: Trace[1984727193]: [10.001252745s] [10.001252745s] END Oct 07 13:07:23 crc kubenswrapper[4677]: E1007 13:07:23.522246 4677 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Oct 07 13:07:23 crc kubenswrapper[4677]: I1007 13:07:23.665239 4677 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Oct 07 13:07:23 crc kubenswrapper[4677]: I1007 13:07:23.665354 4677 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Oct 07 13:07:23 crc kubenswrapper[4677]: I1007 13:07:23.671618 4677 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: 
User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Oct 07 13:07:23 crc kubenswrapper[4677]: I1007 13:07:23.671708 4677 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Oct 07 13:07:24 crc kubenswrapper[4677]: I1007 13:07:24.427402 4677 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Oct 07 13:07:24 crc kubenswrapper[4677]: I1007 13:07:24.429759 4677 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="2e5209710f5d3ab3676742d78170c9463612cae9d1c4c49a63ae23e2e4714d95" exitCode=255 Oct 07 13:07:24 crc kubenswrapper[4677]: I1007 13:07:24.429802 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"2e5209710f5d3ab3676742d78170c9463612cae9d1c4c49a63ae23e2e4714d95"} Oct 07 13:07:24 crc kubenswrapper[4677]: I1007 13:07:24.429948 4677 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 13:07:24 crc kubenswrapper[4677]: I1007 13:07:24.430662 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:24 crc kubenswrapper[4677]: I1007 13:07:24.430712 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:24 crc kubenswrapper[4677]: I1007 13:07:24.430731 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:24 crc kubenswrapper[4677]: I1007 13:07:24.431613 4677 scope.go:117] "RemoveContainer" containerID="2e5209710f5d3ab3676742d78170c9463612cae9d1c4c49a63ae23e2e4714d95" Oct 07 13:07:25 crc kubenswrapper[4677]: I1007 13:07:25.434747 4677 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Oct 07 13:07:25 crc kubenswrapper[4677]: I1007 13:07:25.435548 4677 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Oct 07 13:07:25 crc kubenswrapper[4677]: I1007 13:07:25.438483 4677 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="8f3139524d94a487c756b4a7e3660c0a4380bcba8aa8b588368530f43aab0b1d" exitCode=255 Oct 07 13:07:25 crc kubenswrapper[4677]: I1007 13:07:25.438544 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"8f3139524d94a487c756b4a7e3660c0a4380bcba8aa8b588368530f43aab0b1d"} Oct 07 13:07:25 crc kubenswrapper[4677]: I1007 13:07:25.438621 4677 scope.go:117] "RemoveContainer" containerID="2e5209710f5d3ab3676742d78170c9463612cae9d1c4c49a63ae23e2e4714d95" Oct 07 13:07:25 crc kubenswrapper[4677]: I1007 13:07:25.438735 4677 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 13:07:25 crc kubenswrapper[4677]: I1007 13:07:25.442287 4677 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:25 crc kubenswrapper[4677]: I1007 13:07:25.442364 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:25 crc kubenswrapper[4677]: I1007 13:07:25.442424 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:25 crc kubenswrapper[4677]: I1007 13:07:25.443826 4677 scope.go:117] "RemoveContainer" containerID="8f3139524d94a487c756b4a7e3660c0a4380bcba8aa8b588368530f43aab0b1d" Oct 07 13:07:25 crc kubenswrapper[4677]: E1007 13:07:25.444494 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Oct 07 13:07:26 crc kubenswrapper[4677]: I1007 13:07:26.442371 4677 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Oct 07 13:07:27 crc kubenswrapper[4677]: I1007 13:07:27.867726 4677 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 07 13:07:27 crc kubenswrapper[4677]: I1007 13:07:27.867930 4677 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 13:07:27 crc kubenswrapper[4677]: I1007 13:07:27.869198 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:27 crc kubenswrapper[4677]: I1007 13:07:27.869238 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:27 crc kubenswrapper[4677]: I1007 13:07:27.869256 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:27 crc kubenswrapper[4677]: I1007 13:07:27.870041 4677 scope.go:117] "RemoveContainer" containerID="8f3139524d94a487c756b4a7e3660c0a4380bcba8aa8b588368530f43aab0b1d" Oct 07 13:07:27 crc kubenswrapper[4677]: E1007 13:07:27.870330 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Oct 07 13:07:27 crc kubenswrapper[4677]: I1007 13:07:27.873276 4677 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 07 13:07:27 crc kubenswrapper[4677]: I1007 13:07:27.918313 4677 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 07 13:07:28 crc kubenswrapper[4677]: I1007 13:07:28.449317 4677 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 13:07:28 crc kubenswrapper[4677]: I1007 13:07:28.450522 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Oct 07 13:07:28 crc kubenswrapper[4677]: I1007 13:07:28.450567 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:28 crc kubenswrapper[4677]: I1007 13:07:28.450580 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:28 crc kubenswrapper[4677]: I1007 13:07:28.451180 4677 scope.go:117] "RemoveContainer" containerID="8f3139524d94a487c756b4a7e3660c0a4380bcba8aa8b588368530f43aab0b1d" Oct 07 13:07:28 crc kubenswrapper[4677]: E1007 13:07:28.451357 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Oct 07 13:07:28 crc kubenswrapper[4677]: I1007 13:07:28.456264 4677 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 07 13:07:28 crc kubenswrapper[4677]: I1007 13:07:28.664169 4677 trace.go:236] Trace[724504227]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (07-Oct-2025 13:07:17.763) (total time: 10901ms): Oct 07 13:07:28 crc kubenswrapper[4677]: Trace[724504227]: ---"Objects listed" error: 10901ms (13:07:28.664) Oct 07 13:07:28 crc kubenswrapper[4677]: Trace[724504227]: [10.901059842s] [10.901059842s] END Oct 07 13:07:28 crc kubenswrapper[4677]: I1007 13:07:28.664479 4677 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Oct 07 13:07:28 crc kubenswrapper[4677]: I1007 13:07:28.664924 4677 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Oct 07 13:07:28 crc kubenswrapper[4677]: I1007 13:07:28.666388 4677 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Oct 07 13:07:28 crc kubenswrapper[4677]: E1007 13:07:28.667508 4677 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded" interval="6.4s" Oct 07 13:07:28 crc kubenswrapper[4677]: E1007 13:07:28.668198 4677 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc" Oct 07 13:07:28 crc kubenswrapper[4677]: I1007 13:07:28.676319 4677 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Oct 07 13:07:28 crc kubenswrapper[4677]: I1007 13:07:28.694247 4677 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.211792 4677 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.226782 4677 apiserver.go:52] "Watching apiserver" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.228733 4677 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.229098 4677 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-dns/node-resolver-c2h2k","openshift-etcd/etcd-crc","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"] Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.229454 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.229495 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.229557 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 07 13:07:29 crc kubenswrapper[4677]: E1007 13:07:29.229617 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.229669 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 13:07:29 crc kubenswrapper[4677]: E1007 13:07:29.229753 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.230120 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.230216 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:07:29 crc kubenswrapper[4677]: E1007 13:07:29.230281 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.230165 4677 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-c2h2k" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.231906 4677 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.233157 4677 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.233264 4677 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.233413 4677 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.233624 4677 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.234308 4677 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.234311 4677 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.235100 4677 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.235279 4677 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.235402 4677 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.235546 4677 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.235713 4677 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.236228 4677 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.262902 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.268191 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.268248 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.268280 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.268305 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.268325 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.268349 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.268370 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.268390 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.268414 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.268452 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.268474 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.268497 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.268518 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.268541 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.268590 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.268586 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.268615 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.268638 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.268664 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.268872 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.269133 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.269169 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.269203 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.269581 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.269841 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.270071 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.270308 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.271112 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.271328 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.271549 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.271601 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.271627 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.271648 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.271667 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.271687 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.271730 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.271756 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.271772 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.271790 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: 
\"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.271809 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.271826 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.271846 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.271864 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.271880 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.271899 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.271917 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.271936 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.271953 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.271974 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.271992 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.272015 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.272030 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.272049 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.272067 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.272086 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.272102 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.272119 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.272137 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.272151 4677 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.272170 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.272188 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.272205 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.272221 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.272239 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.272258 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.272272 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.272298 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.272316 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.272334 4677 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.272353 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.272370 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.272388 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.272404 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.272422 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.272456 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.272471 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.272491 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.272511 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.272529 4677 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.272543 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.272560 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.272578 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.272601 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.272623 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.272642 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.272658 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.272677 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.272696 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.272712 4677 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.272733 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.272756 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.272775 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.272798 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.272815 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.272835 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.272853 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.272872 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.272890 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.272905 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.272923 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.272941 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.272964 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.272980 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.272999 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.273016 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.273032 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.273050 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.273068 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 07 13:07:29 crc 
kubenswrapper[4677]: I1007 13:07:29.273086 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.273105 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.273123 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.273141 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.273158 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.273179 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.273158 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.273198 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.273339 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.273401 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.273427 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.273532 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.273612 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.273679 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.273690 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.273712 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.273744 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.273795 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.273812 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.273882 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.273943 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.273995 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.274054 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.274056 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.274141 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.274169 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.274207 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.274233 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.274264 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.274279 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.274292 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.274322 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.274294 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.274471 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.274516 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.274547 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.274563 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.274578 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.274655 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.274664 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.274665 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.274689 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.274714 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.274740 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.274769 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.274793 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.274822 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.274848 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.274856 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.274880 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.274904 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.274927 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.274952 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.274964 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.274872 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.274976 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.275020 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.275183 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.275206 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.275214 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.275277 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.275454 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.275455 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.275486 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.275701 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:07:29 crc kubenswrapper[4677]: E1007 13:07:29.275827 4677 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 13:07:29.775801443 +0000 UTC m=+21.261510558 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.275856 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.275954 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.276147 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.275005 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.279290 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.279633 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.279825 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.279924 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.280057 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.280349 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.280574 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.280645 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.281172 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.281342 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.281699 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.281825 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.281960 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.283602 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.283968 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.283984 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.284118 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.285196 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.286925 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.287026 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.287395 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.287516 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.287745 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). 
InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.287870 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.287968 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.288033 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.290315 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.290409 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.290542 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.290886 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.291185 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.291354 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.293405 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.296542 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.299820 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.300120 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.300337 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.301048 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.301582 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.302128 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.302578 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.302963 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.303270 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.303746 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.304460 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). 
InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.304774 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.304997 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.306098 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.306756 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.306907 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.307026 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.307129 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.307234 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.308382 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.338788 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.326179 4677 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.340014 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.340077 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.340120 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.340156 
4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.340181 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.340209 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.340239 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.307589 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.307569 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.307666 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.307682 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.308198 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.304789 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.308333 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.308575 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.308715 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.308782 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.308977 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.311896 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.312194 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.312683 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.317640 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.323552 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.323477 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.323584 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.323645 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.323707 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.323773 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.323800 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.323975 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.324009 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.324172 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.324185 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.324230 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.324348 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.325280 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.336760 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.337302 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.338104 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.338152 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.339962 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.340181 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.340633 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.341065 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.341122 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.341460 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.342381 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.342591 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.341671 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.344860 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.346113 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.346058 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.326401 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.346249 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.347581 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.347895 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.347939 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.362177 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.363057 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.363093 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.363134 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.363137 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.363162 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.363215 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.363246 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.363272 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.363294 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.363360 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.363384 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.363409 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.363450 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.363473 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.363494 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.363515 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.363538 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.363542 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.363558 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.363580 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.363603 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.363623 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.363650 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.363669 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.363717 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.363737 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.363757 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.363776 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.363796 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.363816 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.363826 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.363837 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.363861 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.363885 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.363910 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.363931 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.363953 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.363973 4677 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.363994 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.364015 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.364037 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.364042 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.364059 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.364080 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.364101 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.364121 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.364141 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: 
\"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.364233 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.364399 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.364445 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.364668 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.364749 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.364918 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.365156 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.365302 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.365314 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.365446 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.365689 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.365773 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.366339 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.366619 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.366633 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.366856 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.367090 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.367355 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.367860 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.368153 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.368623 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.369096 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.369450 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.369491 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.369769 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.369945 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.370216 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.370252 4677 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.370383 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.370571 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.370727 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.371006 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.371067 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.371110 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.372939 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.373575 4677 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.375363 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.375575 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.375983 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.376268 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.376275 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.376792 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.377176 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.377651 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.377820 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). 
InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.378285 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.378444 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.378620 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gh4gv\" (UniqueName: \"kubernetes.io/projected/d6a7b491-6ed9-4906-8d2d-d8913a581b95-kube-api-access-gh4gv\") pod \"node-resolver-c2h2k\" (UID: \"d6a7b491-6ed9-4906-8d2d-d8913a581b95\") " pod="openshift-dns/node-resolver-c2h2k" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.378871 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.378983 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.379061 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.379334 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.379337 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:07:29 crc kubenswrapper[4677]: E1007 13:07:29.379519 4677 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.379629 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.379755 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 07 13:07:29 crc kubenswrapper[4677]: E1007 13:07:29.379836 4677 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-07 13:07:29.879810129 +0000 UTC m=+21.365519314 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.379890 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/d6a7b491-6ed9-4906-8d2d-d8913a581b95-hosts-file\") pod \"node-resolver-c2h2k\" (UID: \"d6a7b491-6ed9-4906-8d2d-d8913a581b95\") " pod="openshift-dns/node-resolver-c2h2k" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.379955 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.380008 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.380013 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.380035 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.380080 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.380099 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 07 13:07:29 crc kubenswrapper[4677]: E1007 13:07:29.380301 4677 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.380400 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 13:07:29 crc kubenswrapper[4677]: E1007 13:07:29.380629 4677 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-07 13:07:29.880609933 +0000 UTC m=+21.366319048 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.381003 4677 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.381018 4677 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.381027 4677 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.381040 4677 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.381050 4677 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.381062 4677 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.381072 4677 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.381083 4677 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.381094 4677 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.381104 4677 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.381116 4677 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.381125 4677 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" 
DevicePath \"\"" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.381133 4677 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.381141 4677 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.381151 4677 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.381163 4677 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.381175 4677 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.381186 4677 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.381198 4677 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.381208 4677 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.381211 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.381220 4677 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.381231 4677 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.381247 4677 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.381269 4677 reconciler_common.go:293] "Volume 
detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.381283 4677 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.381292 4677 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.381301 4677 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.381311 4677 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.381323 4677 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.381332 4677 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.381341 4677 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.381350 4677 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.381358 4677 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.381367 4677 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.381379 4677 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.381387 4677 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.381396 4677 
reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.381404 4677 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.381412 4677 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.381420 4677 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.381442 4677 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.381451 4677 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.381459 4677 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.381468 4677 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.381476 4677 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.381484 4677 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.381493 4677 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.381501 4677 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.381509 4677 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.381520 4677 reconciler_common.go:293] "Volume 
detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.381528 4677 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.381537 4677 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.381546 4677 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.381557 4677 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.381565 4677 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.381575 4677 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.381583 4677 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.381594 4677 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.381603 4677 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.381611 4677 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.381619 4677 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.381627 4677 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.381636 4677 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.381646 4677 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.381654 4677 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.381662 4677 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.381672 4677 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.381682 4677 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.381693 4677 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.381703 4677 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.381715 4677 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.381727 4677 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.381738 4677 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.381746 4677 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.381755 4677 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Oct 07 13:07:29 crc kubenswrapper[4677]: 
I1007 13:07:29.381764 4677 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.381772 4677 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.381781 4677 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.381790 4677 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.381798 4677 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.381807 4677 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.381816 4677 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.381824 4677 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.381832 4677 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.381841 4677 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.381853 4677 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.381861 4677 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.381869 4677 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.381878 4677 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.381887 4677 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.381896 4677 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.381904 4677 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.381913 4677 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.381922 4677 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.381930 4677 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.381938 4677 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.381947 4677 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.381956 4677 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.381964 4677 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.381973 4677 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.381981 4677 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.381989 4677 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.381997 4677 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.382005 4677 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.382014 4677 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.382022 4677 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.382032 4677 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.382040 4677 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.382049 4677 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.382057 4677 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.382066 4677 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.382075 4677 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.382084 4677 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.384127 4677 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.387724 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.388550 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.388655 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.389861 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.389900 4677 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.391288 4677 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.391707 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.391990 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.392926 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 07 13:07:29 crc kubenswrapper[4677]: E1007 13:07:29.404690 4677 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 07 13:07:29 crc kubenswrapper[4677]: E1007 13:07:29.404761 4677 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 07 13:07:29 crc kubenswrapper[4677]: E1007 13:07:29.404776 4677 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 13:07:29 crc kubenswrapper[4677]: E1007 13:07:29.404855 4677 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-07 13:07:29.904836666 +0000 UTC m=+21.390545781 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.404957 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when 
the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 13:07:29 crc kubenswrapper[4677]: E1007 13:07:29.410399 4677 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 07 13:07:29 crc kubenswrapper[4677]: E1007 13:07:29.410421 4677 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 07 13:07:29 crc kubenswrapper[4677]: E1007 13:07:29.410451 4677 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 13:07:29 crc kubenswrapper[4677]: E1007 13:07:29.410503 4677 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-07 13:07:29.910488555 +0000 UTC m=+21.396197670 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.410878 4677 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.411774 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.413177 4677 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.414270 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.414503 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.415494 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.417028 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.422628 4677 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.423398 4677 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.425925 4677 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.427321 4677 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 
13:07:29.429015 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.436647 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.437779 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.438151 4677 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.438723 4677 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.439661 4677 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.440210 4677 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.440642 4677 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.446061 4677 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.446496 4677 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.448002 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9c35782-52f8-4fbc-9e52-07ee92002e3d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9182677b05c8d32f333b4e806b6dc29e0ce3f6171616ed303459ccb6a3754a4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://550a4491cebcbd8b3a62831cce07b13bb79051cd51505aab1f74bcfee692f7b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d90d7cc786a9f94c269a99be97c00685a2e10bde12e0afe4db2de40b95749a47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e99541a9f53339e760fb1074be18ebfcb8b225c
64c290478559d2e3722ba9296\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73cec6b690e4d01d9d206a812f278832d622a7bdfb74ddcfb5904e19f721fae6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f16639187300b01b7c9f30dda26da7c1de9de9c66baf7ad716875eba41a5d7d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f16639187300b01b7c9f30dda26da7c1de9de9c66baf7ad716875eba41a5d7d6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dab4e59081dcacddedf345841dce8c940fae52aeee55d6c956e1364dd70872b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dab4e59081dcacddedf345841dce8c940fae52aeee55d6c956e1364dd70872b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://abcfe5aa83c73ae24b7f0b414b47344969924ca50a5e1669bfb4704f1386cd17\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abcfe5aa83c73ae24b7f0b414b47344969924ca50a5e1669bfb4704f1386cd17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.453606 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.456953 4677 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.457854 4677 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.459067 4677 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.459783 4677 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.460729 4677 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.461267 4677 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.461679 4677 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.463595 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.463974 4677 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.464575 4677 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.465268 4677 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.467094 4677 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" 
path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.468136 4677 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.468746 4677 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.469915 4677 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.470648 4677 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.471600 4677 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Oct 07 13:07:29 crc kubenswrapper[4677]: E1007 13:07:29.472211 4677 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"etcd-crc\" already exists" pod="openshift-etcd/etcd-crc" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.472391 4677 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.473519 4677 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.475870 4677 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.476381 4677 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.477523 4677 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.478197 4677 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.480162 4677 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.480706 4677 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Oct 07 13:07:29 crc kubenswrapper[4677]: 
I1007 13:07:29.481232 4677 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.482708 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.483177 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.483205 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gh4gv\" (UniqueName: \"kubernetes.io/projected/d6a7b491-6ed9-4906-8d2d-d8913a581b95-kube-api-access-gh4gv\") pod \"node-resolver-c2h2k\" (UID: \"d6a7b491-6ed9-4906-8d2d-d8913a581b95\") " pod="openshift-dns/node-resolver-c2h2k" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.483221 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod 
\"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.483243 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/d6a7b491-6ed9-4906-8d2d-d8913a581b95-hosts-file\") pod \"node-resolver-c2h2k\" (UID: \"d6a7b491-6ed9-4906-8d2d-d8913a581b95\") " pod="openshift-dns/node-resolver-c2h2k" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.483286 4677 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.483296 4677 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.483314 4677 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.483322 4677 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.483330 4677 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.483338 4677 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.483347 4677 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.483355 4677 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.483363 4677 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.483371 4677 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.483379 4677 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 
13:07:29.483387 4677 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.483396 4677 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.483404 4677 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.483413 4677 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.483421 4677 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.483446 4677 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.483455 4677 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.483464 4677 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.483473 4677 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.483481 4677 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.483491 4677 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.483500 4677 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.483508 4677 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Oct 07 13:07:29 crc kubenswrapper[4677]: 
I1007 13:07:29.483516 4677 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.483525 4677 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.483533 4677 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.483542 4677 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.483552 4677 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.483560 4677 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.483568 4677 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.483576 4677 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.483584 4677 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.483592 4677 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.483599 4677 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.483607 4677 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.483615 4677 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 07 13:07:29 crc 
kubenswrapper[4677]: I1007 13:07:29.483623 4677 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.483647 4677 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.483655 4677 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.483663 4677 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.483672 4677 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.483680 4677 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.483688 4677 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.483703 4677 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.483712 4677 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.483720 4677 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.483730 4677 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.483743 4677 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.483751 4677 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.483759 4677 reconciler_common.go:293] "Volume detached for volume 
\"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.483800 4677 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.483808 4677 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.483817 4677 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.483828 4677 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.483836 4677 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.483845 4677 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.483854 4677 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.483863 4677 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.483872 4677 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.483881 4677 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.483889 4677 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.483897 4677 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 
13:07:29.483906 4677 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.483914 4677 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.483922 4677 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.483930 4677 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.483938 4677 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.483947 4677 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.483955 4677 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.483963 4677 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.483970 4677 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.483979 4677 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.483987 4677 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.483995 4677 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.484003 4677 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.484011 4677 reconciler_common.go:293] "Volume detached for volume 
\"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.484020 4677 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.484027 4677 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.484035 4677 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.484039 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.484043 4677 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.484106 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/d6a7b491-6ed9-4906-8d2d-d8913a581b95-hosts-file\") pod \"node-resolver-c2h2k\" (UID: \"d6a7b491-6ed9-4906-8d2d-d8913a581b95\") " pod="openshift-dns/node-resolver-c2h2k" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.484125 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.486211 4677 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.486942 4677 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.497534 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.498206 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gh4gv\" (UniqueName: \"kubernetes.io/projected/d6a7b491-6ed9-4906-8d2d-d8913a581b95-kube-api-access-gh4gv\") pod \"node-resolver-c2h2k\" (UID: \"d6a7b491-6ed9-4906-8d2d-d8913a581b95\") " pod="openshift-dns/node-resolver-c2h2k" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.498231 4677 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.498872 4677 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.499921 4677 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-r7cnz"] Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.504412 4677 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.504628 4677 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-29c8j"] Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.505373 4677 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-r7cnz" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.508724 4677 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-czmsr"] Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.510076 4677 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.510791 4677 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.511170 4677 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.511362 4677 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.514373 4677 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-pjgpx"] Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.514493 4677 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.514518 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-czmsr" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.514809 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-29c8j" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.515010 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-pjgpx" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.515737 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.516621 4677 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.519883 4677 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.519901 4677 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.520088 4677 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.520129 4677 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.520208 4677 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.520055 4677 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.520467 4677 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.520474 4677 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.520639 4677 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.520711 4677 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.520854 4677 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.521012 4677 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.521176 4677 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-multus"/"default-dockercfg-2q5b6" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.521281 4677 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.530573 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c2h2k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6a7b491-6ed9-4906-8d2d-d8913a581b95\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gh4gv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c2h2k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.544191 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.545201 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.550156 4677 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.550463 4677 scope.go:117] "RemoveContainer" containerID="8f3139524d94a487c756b4a7e3660c0a4380bcba8aa8b588368530f43aab0b1d" Oct 07 13:07:29 crc kubenswrapper[4677]: E1007 13:07:29.550612 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.557650 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.561373 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.568595 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.578233 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-r7cnz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7879fa59-a7cb-4d29-ba3a-c91f43bfcba6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r59hd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r59hd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-r7cnz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.586179 4677 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.600940 4677 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.603704 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29c8j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3458826a-000d-407d-92c8-236d1a05842e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-29c8j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.614094 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pjgpx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"73bebfb3-50b5-48b6-b348-1d1feb6202d2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h59cg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pjgpx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": 
failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.614391 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-c2h2k" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.626113 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.632824 4677 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.633019 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c2h2k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6a7b491-6ed9-4906-8d2d-d8913a581b95\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gh4gv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c2h2k\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.643094 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-czmsr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67f7f734-b59a-447c-b4a5-5aeb78d3a4dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688d
f312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image
\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-czmsr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 13:07:29 crc kubenswrapper[4677]: W1007 13:07:29.645537 4677 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd6a7b491_6ed9_4906_8d2d_d8913a581b95.slice/crio-7ad0f7e2157f90444c2351d7d602b57b8453fd94568b10db2d2a0aff8195c597 WatchSource:0}: Error finding container 7ad0f7e2157f90444c2351d7d602b57b8453fd94568b10db2d2a0aff8195c597: Status 404 returned error can't find the container with id 7ad0f7e2157f90444c2351d7d602b57b8453fd94568b10db2d2a0aff8195c597 Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.659502 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9c35782-52f8-4fbc-9e52-07ee92002e3d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9182677b05c8d32f333b4e806b6dc29e0ce3f6171616ed303459ccb6a3754a4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://550a4491cebcbd8b3a62831cce07b13bb79051cd51505aab1f74bcfee692f7b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d90d7cc786a9f94c269a99be97c00685a2e10bde12e0afe4db2de40b95749a47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e99541a9f53339e760fb1074be18ebfcb8b225c
64c290478559d2e3722ba9296\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73cec6b690e4d01d9d206a812f278832d622a7bdfb74ddcfb5904e19f721fae6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f16639187300b01b7c9f30dda26da7c1de9de9c66baf7ad716875eba41a5d7d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f16639187300b01b7c9f30dda26da7c1de9de9c66baf7ad716875eba41a5d7d6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dab4e59081dcacddedf345841dce8c940fae52aeee55d6c956e1364dd70872b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dab4e59081dcacddedf345841dce8c940fae52aeee55d6c956e1364dd70872b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://abcfe5aa83c73ae24b7f0b414b47344969924ca50a5e1669bfb4704f1386cd17\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abcfe5aa83c73ae24b7f0b414b47344969924ca50a5e1669bfb4704f1386cd17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.672969 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.683666 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.686963 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3458826a-000d-407d-92c8-236d1a05842e-run-openvswitch\") pod \"ovnkube-node-29c8j\" (UID: \"3458826a-000d-407d-92c8-236d1a05842e\") " pod="openshift-ovn-kubernetes/ovnkube-node-29c8j" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.687008 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3458826a-000d-407d-92c8-236d1a05842e-host-cni-bin\") pod \"ovnkube-node-29c8j\" (UID: \"3458826a-000d-407d-92c8-236d1a05842e\") " pod="openshift-ovn-kubernetes/ovnkube-node-29c8j" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.687037 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/73bebfb3-50b5-48b6-b348-1d1feb6202d2-cnibin\") pod \"multus-pjgpx\" (UID: \"73bebfb3-50b5-48b6-b348-1d1feb6202d2\") " pod="openshift-multus/multus-pjgpx" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.687070 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/73bebfb3-50b5-48b6-b348-1d1feb6202d2-multus-socket-dir-parent\") pod \"multus-pjgpx\" (UID: \"73bebfb3-50b5-48b6-b348-1d1feb6202d2\") " pod="openshift-multus/multus-pjgpx" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.687107 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/73bebfb3-50b5-48b6-b348-1d1feb6202d2-host-var-lib-kubelet\") pod \"multus-pjgpx\" (UID: \"73bebfb3-50b5-48b6-b348-1d1feb6202d2\") " pod="openshift-multus/multus-pjgpx" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.687177 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/73bebfb3-50b5-48b6-b348-1d1feb6202d2-host-var-lib-cni-multus\") pod \"multus-pjgpx\" (UID: \"73bebfb3-50b5-48b6-b348-1d1feb6202d2\") " pod="openshift-multus/multus-pjgpx" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.687211 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/73bebfb3-50b5-48b6-b348-1d1feb6202d2-etc-kubernetes\") pod \"multus-pjgpx\" (UID: \"73bebfb3-50b5-48b6-b348-1d1feb6202d2\") " pod="openshift-multus/multus-pjgpx" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.687286 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/67f7f734-b59a-447c-b4a5-5aeb78d3a4dc-os-release\") pod \"multus-additional-cni-plugins-czmsr\" (UID: \"67f7f734-b59a-447c-b4a5-5aeb78d3a4dc\") " pod="openshift-multus/multus-additional-cni-plugins-czmsr" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.687341 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/3458826a-000d-407d-92c8-236d1a05842e-node-log\") pod \"ovnkube-node-29c8j\" (UID: \"3458826a-000d-407d-92c8-236d1a05842e\") " pod="openshift-ovn-kubernetes/ovnkube-node-29c8j" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.687362 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/73bebfb3-50b5-48b6-b348-1d1feb6202d2-hostroot\") pod \"multus-pjgpx\" (UID: \"73bebfb3-50b5-48b6-b348-1d1feb6202d2\") " pod="openshift-multus/multus-pjgpx" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.687378 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/73bebfb3-50b5-48b6-b348-1d1feb6202d2-os-release\") pod \"multus-pjgpx\" (UID: \"73bebfb3-50b5-48b6-b348-1d1feb6202d2\") " pod="openshift-multus/multus-pjgpx" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.687397 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/73bebfb3-50b5-48b6-b348-1d1feb6202d2-host-run-multus-certs\") pod \"multus-pjgpx\" (UID: \"73bebfb3-50b5-48b6-b348-1d1feb6202d2\") " pod="openshift-multus/multus-pjgpx" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.687418 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/67f7f734-b59a-447c-b4a5-5aeb78d3a4dc-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-czmsr\" (UID: \"67f7f734-b59a-447c-b4a5-5aeb78d3a4dc\") " pod="openshift-multus/multus-additional-cni-plugins-czmsr" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.687456 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3458826a-000d-407d-92c8-236d1a05842e-host-run-netns\") pod \"ovnkube-node-29c8j\" (UID: \"3458826a-000d-407d-92c8-236d1a05842e\") " pod="openshift-ovn-kubernetes/ovnkube-node-29c8j" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.687473 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/3458826a-000d-407d-92c8-236d1a05842e-host-cni-netd\") pod \"ovnkube-node-29c8j\" (UID: \"3458826a-000d-407d-92c8-236d1a05842e\") " pod="openshift-ovn-kubernetes/ovnkube-node-29c8j" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.687493 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/73bebfb3-50b5-48b6-b348-1d1feb6202d2-cni-binary-copy\") pod \"multus-pjgpx\" (UID: \"73bebfb3-50b5-48b6-b348-1d1feb6202d2\") " pod="openshift-multus/multus-pjgpx" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.687516 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/7879fa59-a7cb-4d29-ba3a-c91f43bfcba6-rootfs\") pod \"machine-config-daemon-r7cnz\" (UID: \"7879fa59-a7cb-4d29-ba3a-c91f43bfcba6\") " pod="openshift-machine-config-operator/machine-config-daemon-r7cnz" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.687570 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/3458826a-000d-407d-92c8-236d1a05842e-env-overrides\") pod \"ovnkube-node-29c8j\" (UID: \"3458826a-000d-407d-92c8-236d1a05842e\") " pod="openshift-ovn-kubernetes/ovnkube-node-29c8j" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.687588 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/3458826a-000d-407d-92c8-236d1a05842e-ovnkube-script-lib\") pod \"ovnkube-node-29c8j\" (UID: \"3458826a-000d-407d-92c8-236d1a05842e\") " pod="openshift-ovn-kubernetes/ovnkube-node-29c8j" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.687608 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/3458826a-000d-407d-92c8-236d1a05842e-ovn-node-metrics-cert\") pod \"ovnkube-node-29c8j\" (UID: \"3458826a-000d-407d-92c8-236d1a05842e\") " pod="openshift-ovn-kubernetes/ovnkube-node-29c8j" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.687625 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rstqd\" (UniqueName: \"kubernetes.io/projected/67f7f734-b59a-447c-b4a5-5aeb78d3a4dc-kube-api-access-rstqd\") pod \"multus-additional-cni-plugins-czmsr\" (UID: \"67f7f734-b59a-447c-b4a5-5aeb78d3a4dc\") " pod="openshift-multus/multus-additional-cni-plugins-czmsr" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.687650 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3458826a-000d-407d-92c8-236d1a05842e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-29c8j\" (UID: \"3458826a-000d-407d-92c8-236d1a05842e\") " pod="openshift-ovn-kubernetes/ovnkube-node-29c8j" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.687673 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/3458826a-000d-407d-92c8-236d1a05842e-run-systemd\") pod \"ovnkube-node-29c8j\" (UID: \"3458826a-000d-407d-92c8-236d1a05842e\") " pod="openshift-ovn-kubernetes/ovnkube-node-29c8j" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.687693 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/73bebfb3-50b5-48b6-b348-1d1feb6202d2-multus-cni-dir\") pod \"multus-pjgpx\" (UID: \"73bebfb3-50b5-48b6-b348-1d1feb6202d2\") " pod="openshift-multus/multus-pjgpx" Oct 07 
13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.687710 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/67f7f734-b59a-447c-b4a5-5aeb78d3a4dc-cni-binary-copy\") pod \"multus-additional-cni-plugins-czmsr\" (UID: \"67f7f734-b59a-447c-b4a5-5aeb78d3a4dc\") " pod="openshift-multus/multus-additional-cni-plugins-czmsr" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.687728 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7879fa59-a7cb-4d29-ba3a-c91f43bfcba6-proxy-tls\") pod \"machine-config-daemon-r7cnz\" (UID: \"7879fa59-a7cb-4d29-ba3a-c91f43bfcba6\") " pod="openshift-machine-config-operator/machine-config-daemon-r7cnz" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.687786 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/3458826a-000d-407d-92c8-236d1a05842e-run-ovn\") pod \"ovnkube-node-29c8j\" (UID: \"3458826a-000d-407d-92c8-236d1a05842e\") " pod="openshift-ovn-kubernetes/ovnkube-node-29c8j" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.687801 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/3458826a-000d-407d-92c8-236d1a05842e-log-socket\") pod \"ovnkube-node-29c8j\" (UID: \"3458826a-000d-407d-92c8-236d1a05842e\") " pod="openshift-ovn-kubernetes/ovnkube-node-29c8j" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.687815 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vm7l2\" (UniqueName: \"kubernetes.io/projected/3458826a-000d-407d-92c8-236d1a05842e-kube-api-access-vm7l2\") pod \"ovnkube-node-29c8j\" (UID: \"3458826a-000d-407d-92c8-236d1a05842e\") " pod="openshift-ovn-kubernetes/ovnkube-node-29c8j" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.687833 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/7879fa59-a7cb-4d29-ba3a-c91f43bfcba6-mcd-auth-proxy-config\") pod \"machine-config-daemon-r7cnz\" (UID: \"7879fa59-a7cb-4d29-ba3a-c91f43bfcba6\") " pod="openshift-machine-config-operator/machine-config-daemon-r7cnz" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.687849 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/67f7f734-b59a-447c-b4a5-5aeb78d3a4dc-tuning-conf-dir\") pod \"multus-additional-cni-plugins-czmsr\" (UID: \"67f7f734-b59a-447c-b4a5-5aeb78d3a4dc\") " pod="openshift-multus/multus-additional-cni-plugins-czmsr" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.687886 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/73bebfb3-50b5-48b6-b348-1d1feb6202d2-multus-daemon-config\") pod \"multus-pjgpx\" (UID: \"73bebfb3-50b5-48b6-b348-1d1feb6202d2\") " pod="openshift-multus/multus-pjgpx" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.687906 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: 
\"kubernetes.io/host-path/73bebfb3-50b5-48b6-b348-1d1feb6202d2-multus-conf-dir\") pod \"multus-pjgpx\" (UID: \"73bebfb3-50b5-48b6-b348-1d1feb6202d2\") " pod="openshift-multus/multus-pjgpx" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.688094 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h59cg\" (UniqueName: \"kubernetes.io/projected/73bebfb3-50b5-48b6-b348-1d1feb6202d2-kube-api-access-h59cg\") pod \"multus-pjgpx\" (UID: \"73bebfb3-50b5-48b6-b348-1d1feb6202d2\") " pod="openshift-multus/multus-pjgpx" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.688125 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/3458826a-000d-407d-92c8-236d1a05842e-systemd-units\") pod \"ovnkube-node-29c8j\" (UID: \"3458826a-000d-407d-92c8-236d1a05842e\") " pod="openshift-ovn-kubernetes/ovnkube-node-29c8j" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.688140 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/3458826a-000d-407d-92c8-236d1a05842e-ovnkube-config\") pod \"ovnkube-node-29c8j\" (UID: \"3458826a-000d-407d-92c8-236d1a05842e\") " pod="openshift-ovn-kubernetes/ovnkube-node-29c8j" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.688180 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/73bebfb3-50b5-48b6-b348-1d1feb6202d2-system-cni-dir\") pod \"multus-pjgpx\" (UID: \"73bebfb3-50b5-48b6-b348-1d1feb6202d2\") " pod="openshift-multus/multus-pjgpx" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.688213 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/73bebfb3-50b5-48b6-b348-1d1feb6202d2-host-run-k8s-cni-cncf-io\") pod \"multus-pjgpx\" (UID: \"73bebfb3-50b5-48b6-b348-1d1feb6202d2\") " pod="openshift-multus/multus-pjgpx" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.688228 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/73bebfb3-50b5-48b6-b348-1d1feb6202d2-host-var-lib-cni-bin\") pod \"multus-pjgpx\" (UID: \"73bebfb3-50b5-48b6-b348-1d1feb6202d2\") " pod="openshift-multus/multus-pjgpx" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.688266 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r59hd\" (UniqueName: \"kubernetes.io/projected/7879fa59-a7cb-4d29-ba3a-c91f43bfcba6-kube-api-access-r59hd\") pod \"machine-config-daemon-r7cnz\" (UID: \"7879fa59-a7cb-4d29-ba3a-c91f43bfcba6\") " pod="openshift-machine-config-operator/machine-config-daemon-r7cnz" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.688284 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/3458826a-000d-407d-92c8-236d1a05842e-host-slash\") pod \"ovnkube-node-29c8j\" (UID: \"3458826a-000d-407d-92c8-236d1a05842e\") " pod="openshift-ovn-kubernetes/ovnkube-node-29c8j" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.688298 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3458826a-000d-407d-92c8-236d1a05842e-etc-openvswitch\") pod \"ovnkube-node-29c8j\" (UID: \"3458826a-000d-407d-92c8-236d1a05842e\") " pod="openshift-ovn-kubernetes/ovnkube-node-29c8j" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.688331 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3458826a-000d-407d-92c8-236d1a05842e-host-run-ovn-kubernetes\") pod \"ovnkube-node-29c8j\" (UID: \"3458826a-000d-407d-92c8-236d1a05842e\") " pod="openshift-ovn-kubernetes/ovnkube-node-29c8j" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.688348 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/73bebfb3-50b5-48b6-b348-1d1feb6202d2-host-run-netns\") pod \"multus-pjgpx\" (UID: \"73bebfb3-50b5-48b6-b348-1d1feb6202d2\") " pod="openshift-multus/multus-pjgpx" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.688363 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/3458826a-000d-407d-92c8-236d1a05842e-host-kubelet\") pod \"ovnkube-node-29c8j\" (UID: \"3458826a-000d-407d-92c8-236d1a05842e\") " pod="openshift-ovn-kubernetes/ovnkube-node-29c8j" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.688378 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3458826a-000d-407d-92c8-236d1a05842e-var-lib-openvswitch\") pod \"ovnkube-node-29c8j\" (UID: \"3458826a-000d-407d-92c8-236d1a05842e\") " pod="openshift-ovn-kubernetes/ovnkube-node-29c8j" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.688410 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/67f7f734-b59a-447c-b4a5-5aeb78d3a4dc-system-cni-dir\") pod \"multus-additional-cni-plugins-czmsr\" (UID: \"67f7f734-b59a-447c-b4a5-5aeb78d3a4dc\") " pod="openshift-multus/multus-additional-cni-plugins-czmsr" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.688446 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/67f7f734-b59a-447c-b4a5-5aeb78d3a4dc-cnibin\") pod \"multus-additional-cni-plugins-czmsr\" (UID: \"67f7f734-b59a-447c-b4a5-5aeb78d3a4dc\") " pod="openshift-multus/multus-additional-cni-plugins-czmsr" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.789481 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.789574 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/3458826a-000d-407d-92c8-236d1a05842e-host-kubelet\") pod \"ovnkube-node-29c8j\" (UID: \"3458826a-000d-407d-92c8-236d1a05842e\") " pod="openshift-ovn-kubernetes/ovnkube-node-29c8j" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 
13:07:29.789595 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3458826a-000d-407d-92c8-236d1a05842e-var-lib-openvswitch\") pod \"ovnkube-node-29c8j\" (UID: \"3458826a-000d-407d-92c8-236d1a05842e\") " pod="openshift-ovn-kubernetes/ovnkube-node-29c8j" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.789610 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/67f7f734-b59a-447c-b4a5-5aeb78d3a4dc-system-cni-dir\") pod \"multus-additional-cni-plugins-czmsr\" (UID: \"67f7f734-b59a-447c-b4a5-5aeb78d3a4dc\") " pod="openshift-multus/multus-additional-cni-plugins-czmsr" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.789623 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/67f7f734-b59a-447c-b4a5-5aeb78d3a4dc-cnibin\") pod \"multus-additional-cni-plugins-czmsr\" (UID: \"67f7f734-b59a-447c-b4a5-5aeb78d3a4dc\") " pod="openshift-multus/multus-additional-cni-plugins-czmsr" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.789637 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/73bebfb3-50b5-48b6-b348-1d1feb6202d2-host-var-lib-kubelet\") pod \"multus-pjgpx\" (UID: \"73bebfb3-50b5-48b6-b348-1d1feb6202d2\") " pod="openshift-multus/multus-pjgpx" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.789651 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3458826a-000d-407d-92c8-236d1a05842e-run-openvswitch\") pod \"ovnkube-node-29c8j\" (UID: \"3458826a-000d-407d-92c8-236d1a05842e\") " pod="openshift-ovn-kubernetes/ovnkube-node-29c8j" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.789665 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3458826a-000d-407d-92c8-236d1a05842e-host-cni-bin\") pod \"ovnkube-node-29c8j\" (UID: \"3458826a-000d-407d-92c8-236d1a05842e\") " pod="openshift-ovn-kubernetes/ovnkube-node-29c8j" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.789679 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/73bebfb3-50b5-48b6-b348-1d1feb6202d2-cnibin\") pod \"multus-pjgpx\" (UID: \"73bebfb3-50b5-48b6-b348-1d1feb6202d2\") " pod="openshift-multus/multus-pjgpx" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.789693 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/73bebfb3-50b5-48b6-b348-1d1feb6202d2-multus-socket-dir-parent\") pod \"multus-pjgpx\" (UID: \"73bebfb3-50b5-48b6-b348-1d1feb6202d2\") " pod="openshift-multus/multus-pjgpx" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.789715 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/73bebfb3-50b5-48b6-b348-1d1feb6202d2-host-var-lib-cni-multus\") pod \"multus-pjgpx\" (UID: \"73bebfb3-50b5-48b6-b348-1d1feb6202d2\") " pod="openshift-multus/multus-pjgpx" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.789728 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/73bebfb3-50b5-48b6-b348-1d1feb6202d2-etc-kubernetes\") pod \"multus-pjgpx\" (UID: \"73bebfb3-50b5-48b6-b348-1d1feb6202d2\") " pod="openshift-multus/multus-pjgpx" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.789743 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/73bebfb3-50b5-48b6-b348-1d1feb6202d2-hostroot\") pod \"multus-pjgpx\" (UID: \"73bebfb3-50b5-48b6-b348-1d1feb6202d2\") " pod="openshift-multus/multus-pjgpx" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.789756 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/67f7f734-b59a-447c-b4a5-5aeb78d3a4dc-os-release\") pod \"multus-additional-cni-plugins-czmsr\" (UID: \"67f7f734-b59a-447c-b4a5-5aeb78d3a4dc\") " pod="openshift-multus/multus-additional-cni-plugins-czmsr" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.789771 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/3458826a-000d-407d-92c8-236d1a05842e-node-log\") pod \"ovnkube-node-29c8j\" (UID: \"3458826a-000d-407d-92c8-236d1a05842e\") " pod="openshift-ovn-kubernetes/ovnkube-node-29c8j" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.789784 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/73bebfb3-50b5-48b6-b348-1d1feb6202d2-os-release\") pod \"multus-pjgpx\" (UID: \"73bebfb3-50b5-48b6-b348-1d1feb6202d2\") " pod="openshift-multus/multus-pjgpx" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.789797 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/73bebfb3-50b5-48b6-b348-1d1feb6202d2-host-run-multus-certs\") pod \"multus-pjgpx\" (UID: \"73bebfb3-50b5-48b6-b348-1d1feb6202d2\") " pod="openshift-multus/multus-pjgpx" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.789810 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/67f7f734-b59a-447c-b4a5-5aeb78d3a4dc-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-czmsr\" (UID: \"67f7f734-b59a-447c-b4a5-5aeb78d3a4dc\") " pod="openshift-multus/multus-additional-cni-plugins-czmsr" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.789825 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/3458826a-000d-407d-92c8-236d1a05842e-host-cni-netd\") pod \"ovnkube-node-29c8j\" (UID: \"3458826a-000d-407d-92c8-236d1a05842e\") " pod="openshift-ovn-kubernetes/ovnkube-node-29c8j" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.789840 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3458826a-000d-407d-92c8-236d1a05842e-host-run-netns\") pod \"ovnkube-node-29c8j\" (UID: \"3458826a-000d-407d-92c8-236d1a05842e\") " pod="openshift-ovn-kubernetes/ovnkube-node-29c8j" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.789858 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/3458826a-000d-407d-92c8-236d1a05842e-ovnkube-script-lib\") pod 
\"ovnkube-node-29c8j\" (UID: \"3458826a-000d-407d-92c8-236d1a05842e\") " pod="openshift-ovn-kubernetes/ovnkube-node-29c8j" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.789877 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/73bebfb3-50b5-48b6-b348-1d1feb6202d2-cni-binary-copy\") pod \"multus-pjgpx\" (UID: \"73bebfb3-50b5-48b6-b348-1d1feb6202d2\") " pod="openshift-multus/multus-pjgpx" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.789892 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/7879fa59-a7cb-4d29-ba3a-c91f43bfcba6-rootfs\") pod \"machine-config-daemon-r7cnz\" (UID: \"7879fa59-a7cb-4d29-ba3a-c91f43bfcba6\") " pod="openshift-machine-config-operator/machine-config-daemon-r7cnz" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.789913 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/3458826a-000d-407d-92c8-236d1a05842e-env-overrides\") pod \"ovnkube-node-29c8j\" (UID: \"3458826a-000d-407d-92c8-236d1a05842e\") " pod="openshift-ovn-kubernetes/ovnkube-node-29c8j" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.789927 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/3458826a-000d-407d-92c8-236d1a05842e-ovn-node-metrics-cert\") pod \"ovnkube-node-29c8j\" (UID: \"3458826a-000d-407d-92c8-236d1a05842e\") " pod="openshift-ovn-kubernetes/ovnkube-node-29c8j" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.789943 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rstqd\" (UniqueName: \"kubernetes.io/projected/67f7f734-b59a-447c-b4a5-5aeb78d3a4dc-kube-api-access-rstqd\") pod \"multus-additional-cni-plugins-czmsr\" (UID: \"67f7f734-b59a-447c-b4a5-5aeb78d3a4dc\") " pod="openshift-multus/multus-additional-cni-plugins-czmsr" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.789958 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3458826a-000d-407d-92c8-236d1a05842e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-29c8j\" (UID: \"3458826a-000d-407d-92c8-236d1a05842e\") " pod="openshift-ovn-kubernetes/ovnkube-node-29c8j" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.789972 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/67f7f734-b59a-447c-b4a5-5aeb78d3a4dc-cni-binary-copy\") pod \"multus-additional-cni-plugins-czmsr\" (UID: \"67f7f734-b59a-447c-b4a5-5aeb78d3a4dc\") " pod="openshift-multus/multus-additional-cni-plugins-czmsr" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.789986 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/3458826a-000d-407d-92c8-236d1a05842e-run-systemd\") pod \"ovnkube-node-29c8j\" (UID: \"3458826a-000d-407d-92c8-236d1a05842e\") " pod="openshift-ovn-kubernetes/ovnkube-node-29c8j" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.790000 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/73bebfb3-50b5-48b6-b348-1d1feb6202d2-multus-cni-dir\") pod \"multus-pjgpx\" (UID: \"73bebfb3-50b5-48b6-b348-1d1feb6202d2\") " pod="openshift-multus/multus-pjgpx" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.790016 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vm7l2\" (UniqueName: \"kubernetes.io/projected/3458826a-000d-407d-92c8-236d1a05842e-kube-api-access-vm7l2\") pod \"ovnkube-node-29c8j\" (UID: \"3458826a-000d-407d-92c8-236d1a05842e\") " pod="openshift-ovn-kubernetes/ovnkube-node-29c8j" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.790029 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7879fa59-a7cb-4d29-ba3a-c91f43bfcba6-proxy-tls\") pod \"machine-config-daemon-r7cnz\" (UID: \"7879fa59-a7cb-4d29-ba3a-c91f43bfcba6\") " pod="openshift-machine-config-operator/machine-config-daemon-r7cnz" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.790043 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/3458826a-000d-407d-92c8-236d1a05842e-run-ovn\") pod \"ovnkube-node-29c8j\" (UID: \"3458826a-000d-407d-92c8-236d1a05842e\") " pod="openshift-ovn-kubernetes/ovnkube-node-29c8j" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.790064 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/3458826a-000d-407d-92c8-236d1a05842e-log-socket\") pod \"ovnkube-node-29c8j\" (UID: \"3458826a-000d-407d-92c8-236d1a05842e\") " pod="openshift-ovn-kubernetes/ovnkube-node-29c8j" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.790077 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/67f7f734-b59a-447c-b4a5-5aeb78d3a4dc-tuning-conf-dir\") pod \"multus-additional-cni-plugins-czmsr\" (UID: \"67f7f734-b59a-447c-b4a5-5aeb78d3a4dc\") " pod="openshift-multus/multus-additional-cni-plugins-czmsr" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.790093 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/7879fa59-a7cb-4d29-ba3a-c91f43bfcba6-mcd-auth-proxy-config\") pod \"machine-config-daemon-r7cnz\" (UID: \"7879fa59-a7cb-4d29-ba3a-c91f43bfcba6\") " pod="openshift-machine-config-operator/machine-config-daemon-r7cnz" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.790115 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/73bebfb3-50b5-48b6-b348-1d1feb6202d2-multus-daemon-config\") pod \"multus-pjgpx\" (UID: \"73bebfb3-50b5-48b6-b348-1d1feb6202d2\") " pod="openshift-multus/multus-pjgpx" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.790129 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/3458826a-000d-407d-92c8-236d1a05842e-ovnkube-config\") pod \"ovnkube-node-29c8j\" (UID: \"3458826a-000d-407d-92c8-236d1a05842e\") " pod="openshift-ovn-kubernetes/ovnkube-node-29c8j" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.790143 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: 
\"kubernetes.io/host-path/73bebfb3-50b5-48b6-b348-1d1feb6202d2-multus-conf-dir\") pod \"multus-pjgpx\" (UID: \"73bebfb3-50b5-48b6-b348-1d1feb6202d2\") " pod="openshift-multus/multus-pjgpx" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.790157 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h59cg\" (UniqueName: \"kubernetes.io/projected/73bebfb3-50b5-48b6-b348-1d1feb6202d2-kube-api-access-h59cg\") pod \"multus-pjgpx\" (UID: \"73bebfb3-50b5-48b6-b348-1d1feb6202d2\") " pod="openshift-multus/multus-pjgpx" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.790178 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/3458826a-000d-407d-92c8-236d1a05842e-systemd-units\") pod \"ovnkube-node-29c8j\" (UID: \"3458826a-000d-407d-92c8-236d1a05842e\") " pod="openshift-ovn-kubernetes/ovnkube-node-29c8j" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.790193 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/73bebfb3-50b5-48b6-b348-1d1feb6202d2-system-cni-dir\") pod \"multus-pjgpx\" (UID: \"73bebfb3-50b5-48b6-b348-1d1feb6202d2\") " pod="openshift-multus/multus-pjgpx" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.790209 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/73bebfb3-50b5-48b6-b348-1d1feb6202d2-host-run-k8s-cni-cncf-io\") pod \"multus-pjgpx\" (UID: \"73bebfb3-50b5-48b6-b348-1d1feb6202d2\") " pod="openshift-multus/multus-pjgpx" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.790222 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/73bebfb3-50b5-48b6-b348-1d1feb6202d2-host-run-netns\") pod \"multus-pjgpx\" (UID: \"73bebfb3-50b5-48b6-b348-1d1feb6202d2\") " pod="openshift-multus/multus-pjgpx" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.790236 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/73bebfb3-50b5-48b6-b348-1d1feb6202d2-host-var-lib-cni-bin\") pod \"multus-pjgpx\" (UID: \"73bebfb3-50b5-48b6-b348-1d1feb6202d2\") " pod="openshift-multus/multus-pjgpx" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.790249 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r59hd\" (UniqueName: \"kubernetes.io/projected/7879fa59-a7cb-4d29-ba3a-c91f43bfcba6-kube-api-access-r59hd\") pod \"machine-config-daemon-r7cnz\" (UID: \"7879fa59-a7cb-4d29-ba3a-c91f43bfcba6\") " pod="openshift-machine-config-operator/machine-config-daemon-r7cnz" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.790266 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/3458826a-000d-407d-92c8-236d1a05842e-host-slash\") pod \"ovnkube-node-29c8j\" (UID: \"3458826a-000d-407d-92c8-236d1a05842e\") " pod="openshift-ovn-kubernetes/ovnkube-node-29c8j" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.790280 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3458826a-000d-407d-92c8-236d1a05842e-etc-openvswitch\") pod \"ovnkube-node-29c8j\" (UID: 
\"3458826a-000d-407d-92c8-236d1a05842e\") " pod="openshift-ovn-kubernetes/ovnkube-node-29c8j" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.790295 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3458826a-000d-407d-92c8-236d1a05842e-host-run-ovn-kubernetes\") pod \"ovnkube-node-29c8j\" (UID: \"3458826a-000d-407d-92c8-236d1a05842e\") " pod="openshift-ovn-kubernetes/ovnkube-node-29c8j" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.790354 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3458826a-000d-407d-92c8-236d1a05842e-host-run-ovn-kubernetes\") pod \"ovnkube-node-29c8j\" (UID: \"3458826a-000d-407d-92c8-236d1a05842e\") " pod="openshift-ovn-kubernetes/ovnkube-node-29c8j" Oct 07 13:07:29 crc kubenswrapper[4677]: E1007 13:07:29.790419 4677 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 13:07:30.790404639 +0000 UTC m=+22.276113754 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.790460 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/3458826a-000d-407d-92c8-236d1a05842e-host-kubelet\") pod \"ovnkube-node-29c8j\" (UID: \"3458826a-000d-407d-92c8-236d1a05842e\") " pod="openshift-ovn-kubernetes/ovnkube-node-29c8j" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.790483 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3458826a-000d-407d-92c8-236d1a05842e-var-lib-openvswitch\") pod \"ovnkube-node-29c8j\" (UID: \"3458826a-000d-407d-92c8-236d1a05842e\") " pod="openshift-ovn-kubernetes/ovnkube-node-29c8j" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.790520 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/67f7f734-b59a-447c-b4a5-5aeb78d3a4dc-system-cni-dir\") pod \"multus-additional-cni-plugins-czmsr\" (UID: \"67f7f734-b59a-447c-b4a5-5aeb78d3a4dc\") " pod="openshift-multus/multus-additional-cni-plugins-czmsr" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.790539 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/67f7f734-b59a-447c-b4a5-5aeb78d3a4dc-cnibin\") pod \"multus-additional-cni-plugins-czmsr\" (UID: \"67f7f734-b59a-447c-b4a5-5aeb78d3a4dc\") " pod="openshift-multus/multus-additional-cni-plugins-czmsr" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.790558 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/73bebfb3-50b5-48b6-b348-1d1feb6202d2-host-var-lib-kubelet\") pod 
\"multus-pjgpx\" (UID: \"73bebfb3-50b5-48b6-b348-1d1feb6202d2\") " pod="openshift-multus/multus-pjgpx" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.790577 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3458826a-000d-407d-92c8-236d1a05842e-run-openvswitch\") pod \"ovnkube-node-29c8j\" (UID: \"3458826a-000d-407d-92c8-236d1a05842e\") " pod="openshift-ovn-kubernetes/ovnkube-node-29c8j" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.790597 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3458826a-000d-407d-92c8-236d1a05842e-host-cni-bin\") pod \"ovnkube-node-29c8j\" (UID: \"3458826a-000d-407d-92c8-236d1a05842e\") " pod="openshift-ovn-kubernetes/ovnkube-node-29c8j" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.790627 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/73bebfb3-50b5-48b6-b348-1d1feb6202d2-cnibin\") pod \"multus-pjgpx\" (UID: \"73bebfb3-50b5-48b6-b348-1d1feb6202d2\") " pod="openshift-multus/multus-pjgpx" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.790810 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/3458826a-000d-407d-92c8-236d1a05842e-run-systemd\") pod \"ovnkube-node-29c8j\" (UID: \"3458826a-000d-407d-92c8-236d1a05842e\") " pod="openshift-ovn-kubernetes/ovnkube-node-29c8j" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.790873 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/73bebfb3-50b5-48b6-b348-1d1feb6202d2-host-var-lib-cni-multus\") pod \"multus-pjgpx\" (UID: \"73bebfb3-50b5-48b6-b348-1d1feb6202d2\") " pod="openshift-multus/multus-pjgpx" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.790903 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/73bebfb3-50b5-48b6-b348-1d1feb6202d2-multus-socket-dir-parent\") pod \"multus-pjgpx\" (UID: \"73bebfb3-50b5-48b6-b348-1d1feb6202d2\") " pod="openshift-multus/multus-pjgpx" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.790929 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/73bebfb3-50b5-48b6-b348-1d1feb6202d2-hostroot\") pod \"multus-pjgpx\" (UID: \"73bebfb3-50b5-48b6-b348-1d1feb6202d2\") " pod="openshift-multus/multus-pjgpx" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.791048 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/73bebfb3-50b5-48b6-b348-1d1feb6202d2-multus-cni-dir\") pod \"multus-pjgpx\" (UID: \"73bebfb3-50b5-48b6-b348-1d1feb6202d2\") " pod="openshift-multus/multus-pjgpx" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.791125 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/67f7f734-b59a-447c-b4a5-5aeb78d3a4dc-os-release\") pod \"multus-additional-cni-plugins-czmsr\" (UID: \"67f7f734-b59a-447c-b4a5-5aeb78d3a4dc\") " pod="openshift-multus/multus-additional-cni-plugins-czmsr" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.791158 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"node-log\" (UniqueName: \"kubernetes.io/host-path/3458826a-000d-407d-92c8-236d1a05842e-node-log\") pod \"ovnkube-node-29c8j\" (UID: \"3458826a-000d-407d-92c8-236d1a05842e\") " pod="openshift-ovn-kubernetes/ovnkube-node-29c8j" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.791199 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/73bebfb3-50b5-48b6-b348-1d1feb6202d2-os-release\") pod \"multus-pjgpx\" (UID: \"73bebfb3-50b5-48b6-b348-1d1feb6202d2\") " pod="openshift-multus/multus-pjgpx" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.791228 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/73bebfb3-50b5-48b6-b348-1d1feb6202d2-host-run-multus-certs\") pod \"multus-pjgpx\" (UID: \"73bebfb3-50b5-48b6-b348-1d1feb6202d2\") " pod="openshift-multus/multus-pjgpx" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.791592 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/3458826a-000d-407d-92c8-236d1a05842e-systemd-units\") pod \"ovnkube-node-29c8j\" (UID: \"3458826a-000d-407d-92c8-236d1a05842e\") " pod="openshift-ovn-kubernetes/ovnkube-node-29c8j" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.791974 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/67f7f734-b59a-447c-b4a5-5aeb78d3a4dc-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-czmsr\" (UID: \"67f7f734-b59a-447c-b4a5-5aeb78d3a4dc\") " pod="openshift-multus/multus-additional-cni-plugins-czmsr" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.792025 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/3458826a-000d-407d-92c8-236d1a05842e-host-cni-netd\") pod \"ovnkube-node-29c8j\" (UID: \"3458826a-000d-407d-92c8-236d1a05842e\") " pod="openshift-ovn-kubernetes/ovnkube-node-29c8j" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.792056 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3458826a-000d-407d-92c8-236d1a05842e-host-run-netns\") pod \"ovnkube-node-29c8j\" (UID: \"3458826a-000d-407d-92c8-236d1a05842e\") " pod="openshift-ovn-kubernetes/ovnkube-node-29c8j" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.792252 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/73bebfb3-50b5-48b6-b348-1d1feb6202d2-multus-daemon-config\") pod \"multus-pjgpx\" (UID: \"73bebfb3-50b5-48b6-b348-1d1feb6202d2\") " pod="openshift-multus/multus-pjgpx" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.792297 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/7879fa59-a7cb-4d29-ba3a-c91f43bfcba6-mcd-auth-proxy-config\") pod \"machine-config-daemon-r7cnz\" (UID: \"7879fa59-a7cb-4d29-ba3a-c91f43bfcba6\") " pod="openshift-machine-config-operator/machine-config-daemon-r7cnz" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.792355 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/3458826a-000d-407d-92c8-236d1a05842e-run-ovn\") pod \"ovnkube-node-29c8j\" (UID: 
\"3458826a-000d-407d-92c8-236d1a05842e\") " pod="openshift-ovn-kubernetes/ovnkube-node-29c8j" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.792393 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/3458826a-000d-407d-92c8-236d1a05842e-log-socket\") pod \"ovnkube-node-29c8j\" (UID: \"3458826a-000d-407d-92c8-236d1a05842e\") " pod="openshift-ovn-kubernetes/ovnkube-node-29c8j" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.792708 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/73bebfb3-50b5-48b6-b348-1d1feb6202d2-host-var-lib-cni-bin\") pod \"multus-pjgpx\" (UID: \"73bebfb3-50b5-48b6-b348-1d1feb6202d2\") " pod="openshift-multus/multus-pjgpx" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.792727 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/3458826a-000d-407d-92c8-236d1a05842e-host-slash\") pod \"ovnkube-node-29c8j\" (UID: \"3458826a-000d-407d-92c8-236d1a05842e\") " pod="openshift-ovn-kubernetes/ovnkube-node-29c8j" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.792774 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/73bebfb3-50b5-48b6-b348-1d1feb6202d2-system-cni-dir\") pod \"multus-pjgpx\" (UID: \"73bebfb3-50b5-48b6-b348-1d1feb6202d2\") " pod="openshift-multus/multus-pjgpx" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.792788 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/3458826a-000d-407d-92c8-236d1a05842e-ovnkube-config\") pod \"ovnkube-node-29c8j\" (UID: \"3458826a-000d-407d-92c8-236d1a05842e\") " pod="openshift-ovn-kubernetes/ovnkube-node-29c8j" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.792824 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/73bebfb3-50b5-48b6-b348-1d1feb6202d2-host-run-k8s-cni-cncf-io\") pod \"multus-pjgpx\" (UID: \"73bebfb3-50b5-48b6-b348-1d1feb6202d2\") " pod="openshift-multus/multus-pjgpx" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.792829 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/73bebfb3-50b5-48b6-b348-1d1feb6202d2-multus-conf-dir\") pod \"multus-pjgpx\" (UID: \"73bebfb3-50b5-48b6-b348-1d1feb6202d2\") " pod="openshift-multus/multus-pjgpx" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.792894 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/73bebfb3-50b5-48b6-b348-1d1feb6202d2-host-run-netns\") pod \"multus-pjgpx\" (UID: \"73bebfb3-50b5-48b6-b348-1d1feb6202d2\") " pod="openshift-multus/multus-pjgpx" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.790905 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/73bebfb3-50b5-48b6-b348-1d1feb6202d2-etc-kubernetes\") pod \"multus-pjgpx\" (UID: \"73bebfb3-50b5-48b6-b348-1d1feb6202d2\") " pod="openshift-multus/multus-pjgpx" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.792918 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/3458826a-000d-407d-92c8-236d1a05842e-etc-openvswitch\") pod \"ovnkube-node-29c8j\" (UID: \"3458826a-000d-407d-92c8-236d1a05842e\") " pod="openshift-ovn-kubernetes/ovnkube-node-29c8j" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.792934 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/7879fa59-a7cb-4d29-ba3a-c91f43bfcba6-rootfs\") pod \"machine-config-daemon-r7cnz\" (UID: \"7879fa59-a7cb-4d29-ba3a-c91f43bfcba6\") " pod="openshift-machine-config-operator/machine-config-daemon-r7cnz" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.792958 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3458826a-000d-407d-92c8-236d1a05842e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-29c8j\" (UID: \"3458826a-000d-407d-92c8-236d1a05842e\") " pod="openshift-ovn-kubernetes/ovnkube-node-29c8j" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.792993 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/67f7f734-b59a-447c-b4a5-5aeb78d3a4dc-tuning-conf-dir\") pod \"multus-additional-cni-plugins-czmsr\" (UID: \"67f7f734-b59a-447c-b4a5-5aeb78d3a4dc\") " pod="openshift-multus/multus-additional-cni-plugins-czmsr" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.793153 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/3458826a-000d-407d-92c8-236d1a05842e-ovnkube-script-lib\") pod \"ovnkube-node-29c8j\" (UID: \"3458826a-000d-407d-92c8-236d1a05842e\") " pod="openshift-ovn-kubernetes/ovnkube-node-29c8j" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.793447 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/3458826a-000d-407d-92c8-236d1a05842e-env-overrides\") pod \"ovnkube-node-29c8j\" (UID: \"3458826a-000d-407d-92c8-236d1a05842e\") " pod="openshift-ovn-kubernetes/ovnkube-node-29c8j" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.793569 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/73bebfb3-50b5-48b6-b348-1d1feb6202d2-cni-binary-copy\") pod \"multus-pjgpx\" (UID: \"73bebfb3-50b5-48b6-b348-1d1feb6202d2\") " pod="openshift-multus/multus-pjgpx" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.793714 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/67f7f734-b59a-447c-b4a5-5aeb78d3a4dc-cni-binary-copy\") pod \"multus-additional-cni-plugins-czmsr\" (UID: \"67f7f734-b59a-447c-b4a5-5aeb78d3a4dc\") " pod="openshift-multus/multus-additional-cni-plugins-czmsr" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.794096 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7879fa59-a7cb-4d29-ba3a-c91f43bfcba6-proxy-tls\") pod \"machine-config-daemon-r7cnz\" (UID: \"7879fa59-a7cb-4d29-ba3a-c91f43bfcba6\") " pod="openshift-machine-config-operator/machine-config-daemon-r7cnz" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.798881 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/3458826a-000d-407d-92c8-236d1a05842e-ovn-node-metrics-cert\") pod \"ovnkube-node-29c8j\" (UID: \"3458826a-000d-407d-92c8-236d1a05842e\") " pod="openshift-ovn-kubernetes/ovnkube-node-29c8j" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.810135 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vm7l2\" (UniqueName: \"kubernetes.io/projected/3458826a-000d-407d-92c8-236d1a05842e-kube-api-access-vm7l2\") pod \"ovnkube-node-29c8j\" (UID: \"3458826a-000d-407d-92c8-236d1a05842e\") " pod="openshift-ovn-kubernetes/ovnkube-node-29c8j" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.812154 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h59cg\" (UniqueName: \"kubernetes.io/projected/73bebfb3-50b5-48b6-b348-1d1feb6202d2-kube-api-access-h59cg\") pod \"multus-pjgpx\" (UID: \"73bebfb3-50b5-48b6-b348-1d1feb6202d2\") " pod="openshift-multus/multus-pjgpx" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.812470 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rstqd\" (UniqueName: \"kubernetes.io/projected/67f7f734-b59a-447c-b4a5-5aeb78d3a4dc-kube-api-access-rstqd\") pod \"multus-additional-cni-plugins-czmsr\" (UID: \"67f7f734-b59a-447c-b4a5-5aeb78d3a4dc\") " pod="openshift-multus/multus-additional-cni-plugins-czmsr" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.815342 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r59hd\" (UniqueName: \"kubernetes.io/projected/7879fa59-a7cb-4d29-ba3a-c91f43bfcba6-kube-api-access-r59hd\") pod \"machine-config-daemon-r7cnz\" (UID: \"7879fa59-a7cb-4d29-ba3a-c91f43bfcba6\") " pod="openshift-machine-config-operator/machine-config-daemon-r7cnz" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.825884 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-r7cnz" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.837330 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-czmsr" Oct 07 13:07:29 crc kubenswrapper[4677]: W1007 13:07:29.837958 4677 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7879fa59_a7cb_4d29_ba3a_c91f43bfcba6.slice/crio-55bfbc834f2ea1f1be7c55cab27d4ef0f9d2d2119d1e0abdfffda636f42a344f WatchSource:0}: Error finding container 55bfbc834f2ea1f1be7c55cab27d4ef0f9d2d2119d1e0abdfffda636f42a344f: Status 404 returned error can't find the container with id 55bfbc834f2ea1f1be7c55cab27d4ef0f9d2d2119d1e0abdfffda636f42a344f Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.856206 4677 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-29c8j" Oct 07 13:07:29 crc kubenswrapper[4677]: W1007 13:07:29.861324 4677 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod67f7f734_b59a_447c_b4a5_5aeb78d3a4dc.slice/crio-98947326a135f1d2f4e2d7cd8ae621990ec3d3333873e0f0804c74fd6853b58a WatchSource:0}: Error finding container 98947326a135f1d2f4e2d7cd8ae621990ec3d3333873e0f0804c74fd6853b58a: Status 404 returned error can't find the container with id 98947326a135f1d2f4e2d7cd8ae621990ec3d3333873e0f0804c74fd6853b58a Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.866197 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-pjgpx" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.891339 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.891403 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:07:29 crc kubenswrapper[4677]: E1007 13:07:29.891473 4677 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 07 13:07:29 crc kubenswrapper[4677]: E1007 13:07:29.891514 4677 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 07 13:07:29 crc kubenswrapper[4677]: E1007 13:07:29.891564 4677 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-07 13:07:30.891546889 +0000 UTC m=+22.377256004 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 07 13:07:29 crc kubenswrapper[4677]: E1007 13:07:29.891583 4677 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-07 13:07:30.89157576 +0000 UTC m=+22.377284875 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 07 13:07:29 crc kubenswrapper[4677]: W1007 13:07:29.897524 4677 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod73bebfb3_50b5_48b6_b348_1d1feb6202d2.slice/crio-6fe5779638fd9e45f62e2a9b39d81b3514437c7d0185c72e90dd04f1d5207254 WatchSource:0}: Error finding container 6fe5779638fd9e45f62e2a9b39d81b3514437c7d0185c72e90dd04f1d5207254: Status 404 returned error can't find the container with id 6fe5779638fd9e45f62e2a9b39d81b3514437c7d0185c72e90dd04f1d5207254 Oct 07 13:07:29 crc kubenswrapper[4677]: W1007 13:07:29.905932 4677 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3458826a_000d_407d_92c8_236d1a05842e.slice/crio-3277b1e8f93bc16de57864a4db5b92424ff2e8ca9b2fee80d78fd814453a172f WatchSource:0}: Error finding container 3277b1e8f93bc16de57864a4db5b92424ff2e8ca9b2fee80d78fd814453a172f: Status 404 returned error can't find the container with id 3277b1e8f93bc16de57864a4db5b92424ff2e8ca9b2fee80d78fd814453a172f Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.992211 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 13:07:29 crc kubenswrapper[4677]: I1007 13:07:29.992275 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 13:07:29 crc kubenswrapper[4677]: E1007 13:07:29.992408 4677 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 07 13:07:29 crc kubenswrapper[4677]: E1007 13:07:29.992489 4677 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 07 13:07:29 crc kubenswrapper[4677]: E1007 13:07:29.992507 4677 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 13:07:29 crc kubenswrapper[4677]: E1007 13:07:29.992548 4677 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 07 13:07:29 crc kubenswrapper[4677]: E1007 13:07:29.992585 4677 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 07 13:07:29 crc kubenswrapper[4677]: E1007 13:07:29.992627 4677 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 13:07:29 crc kubenswrapper[4677]: E1007 13:07:29.992555 4677 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-07 13:07:30.992539934 +0000 UTC m=+22.478249059 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 13:07:29 crc kubenswrapper[4677]: E1007 13:07:29.992737 4677 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-07 13:07:30.99272365 +0000 UTC m=+22.478432765 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 13:07:30 crc kubenswrapper[4677]: I1007 13:07:30.302691 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 13:07:30 crc kubenswrapper[4677]: E1007 13:07:30.302821 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 13:07:30 crc kubenswrapper[4677]: I1007 13:07:30.462178 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"7eba6bf3771abd49fcfb37fabf19ad339d65c71521006a0f60980fc03057cb5c"} Oct 07 13:07:30 crc kubenswrapper[4677]: I1007 13:07:30.463340 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"2ef29067dc23263a94c4f861ada9ebbe04aae442de3da9fa34db521177f60ce8"} Oct 07 13:07:30 crc kubenswrapper[4677]: I1007 13:07:30.463361 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"f1a906f43e92bb9ceded06680f2aade4e7d26db67e10470fca972737a7b90bd4"} Oct 07 13:07:30 crc kubenswrapper[4677]: I1007 13:07:30.465077 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-pjgpx" event={"ID":"73bebfb3-50b5-48b6-b348-1d1feb6202d2","Type":"ContainerStarted","Data":"6b0ac92c71edc3d5107aece2d0e005a546cf25d79d696f4e330b7c0c8babc546"} Oct 07 13:07:30 crc kubenswrapper[4677]: I1007 13:07:30.465098 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-pjgpx" event={"ID":"73bebfb3-50b5-48b6-b348-1d1feb6202d2","Type":"ContainerStarted","Data":"6fe5779638fd9e45f62e2a9b39d81b3514437c7d0185c72e90dd04f1d5207254"} Oct 07 13:07:30 crc kubenswrapper[4677]: I1007 13:07:30.466369 4677 generic.go:334] "Generic (PLEG): container finished" podID="3458826a-000d-407d-92c8-236d1a05842e" containerID="fa229d0805576aa04db8e58e52e8c8bdc07663414dde42a94cac4fe628cfac7a" exitCode=0 Oct 07 13:07:30 crc kubenswrapper[4677]: I1007 13:07:30.466408 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29c8j" event={"ID":"3458826a-000d-407d-92c8-236d1a05842e","Type":"ContainerDied","Data":"fa229d0805576aa04db8e58e52e8c8bdc07663414dde42a94cac4fe628cfac7a"} Oct 07 13:07:30 crc kubenswrapper[4677]: I1007 13:07:30.466423 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29c8j" event={"ID":"3458826a-000d-407d-92c8-236d1a05842e","Type":"ContainerStarted","Data":"3277b1e8f93bc16de57864a4db5b92424ff2e8ca9b2fee80d78fd814453a172f"} Oct 07 13:07:30 crc kubenswrapper[4677]: I1007 13:07:30.468925 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-r7cnz" event={"ID":"7879fa59-a7cb-4d29-ba3a-c91f43bfcba6","Type":"ContainerStarted","Data":"d5b2cfaaf4533573a7cdf927cb9a0b61690f4f04ca22f5da5013fd218ee2cba1"} Oct 07 13:07:30 crc kubenswrapper[4677]: I1007 13:07:30.468958 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-r7cnz" event={"ID":"7879fa59-a7cb-4d29-ba3a-c91f43bfcba6","Type":"ContainerStarted","Data":"5d3e4ef8267212ad1faf24bfcb3b6f633a283684ba587e304e94d434bd9a2618"} Oct 07 13:07:30 crc kubenswrapper[4677]: I1007 13:07:30.468969 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-r7cnz" 
event={"ID":"7879fa59-a7cb-4d29-ba3a-c91f43bfcba6","Type":"ContainerStarted","Data":"55bfbc834f2ea1f1be7c55cab27d4ef0f9d2d2119d1e0abdfffda636f42a344f"} Oct 07 13:07:30 crc kubenswrapper[4677]: I1007 13:07:30.475086 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-c2h2k" event={"ID":"d6a7b491-6ed9-4906-8d2d-d8913a581b95","Type":"ContainerStarted","Data":"079f44a0676fd6e659268707658dfce76f5c80881ebd1b7f77b831a653002cbb"} Oct 07 13:07:30 crc kubenswrapper[4677]: I1007 13:07:30.475126 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-c2h2k" event={"ID":"d6a7b491-6ed9-4906-8d2d-d8913a581b95","Type":"ContainerStarted","Data":"7ad0f7e2157f90444c2351d7d602b57b8453fd94568b10db2d2a0aff8195c597"} Oct 07 13:07:30 crc kubenswrapper[4677]: I1007 13:07:30.476494 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"bf9347ca53bc58ad2e19bdbccd5eb40fde5ef36cdc0c2a2899e7e86977208446"} Oct 07 13:07:30 crc kubenswrapper[4677]: I1007 13:07:30.476517 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"d7932ee6d24ab75f34dabb17b5c2732dc1437e94b4fab6cace5c5bf4d8b4a8fb"} Oct 07 13:07:30 crc kubenswrapper[4677]: I1007 13:07:30.476527 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"eb8be8cc957c26cde7de930eff38893f2ed032cc6ea6bfc68996870c9087da7e"} Oct 07 13:07:30 crc kubenswrapper[4677]: I1007 13:07:30.478084 4677 generic.go:334] "Generic (PLEG): container finished" podID="67f7f734-b59a-447c-b4a5-5aeb78d3a4dc" containerID="84e2aad54ae491e74845fbb681491bde96694b5f242d15a1b4c0f62b81048e7d" exitCode=0 Oct 07 13:07:30 crc kubenswrapper[4677]: I1007 13:07:30.478136 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-czmsr" event={"ID":"67f7f734-b59a-447c-b4a5-5aeb78d3a4dc","Type":"ContainerDied","Data":"84e2aad54ae491e74845fbb681491bde96694b5f242d15a1b4c0f62b81048e7d"} Oct 07 13:07:30 crc kubenswrapper[4677]: I1007 13:07:30.478385 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-czmsr" event={"ID":"67f7f734-b59a-447c-b4a5-5aeb78d3a4dc","Type":"ContainerStarted","Data":"98947326a135f1d2f4e2d7cd8ae621990ec3d3333873e0f0804c74fd6853b58a"} Oct 07 13:07:30 crc kubenswrapper[4677]: I1007 13:07:30.478665 4677 scope.go:117] "RemoveContainer" containerID="8f3139524d94a487c756b4a7e3660c0a4380bcba8aa8b588368530f43aab0b1d" Oct 07 13:07:30 crc kubenswrapper[4677]: E1007 13:07:30.478820 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Oct 07 13:07:30 crc kubenswrapper[4677]: I1007 13:07:30.483701 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:30Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:30 crc kubenswrapper[4677]: I1007 13:07:30.504644 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c2h2k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6a7b491-6ed9-4906-8d2d-d8913a581b95\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gh4gv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c2h2k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": 
failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:30Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:30 crc kubenswrapper[4677]: I1007 13:07:30.538188 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-czmsr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67f7f734-b59a-447c-b4a5-5aeb78d3a4dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabl
ed\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-czmsr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:30Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:30 crc kubenswrapper[4677]: I1007 13:07:30.574251 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9c35782-52f8-4fbc-9e52-07ee92002e3d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9182677b05c8d32f333b4e806b6dc29e0ce3f6171616ed303459ccb6a3754a4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://550a4491cebcbd8b3a62831cce07b13bb79051cd51505aab1f74bcfee692f7b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d90d7cc786a9f94c269a99be97c00685a2e10bde12e0afe4db2de40b95749a47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e99541a9f53339e760fb1074be18ebfcb8b225c64c290478559d2e3722ba9296\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73cec6b690e4d01d9d206a812f278832d622a7bdfb74ddcfb5904e19f721fae6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f16639187300b01b7c9f30dda26da7c1de9de9c66baf7ad716875eba41a5d7d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441e
cd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f16639187300b01b7c9f30dda26da7c1de9de9c66baf7ad716875eba41a5d7d6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dab4e59081dcacddedf345841dce8c940fae52aeee55d6c956e1364dd70872b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dab4e59081dcacddedf345841dce8c940fae52aeee55d6c956e1364dd70872b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://abcfe5aa83c73ae24b7f0b414b47344969924ca50a5e1669bfb4704f1386cd17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abcfe5aa83c73ae24b7f0b414b47344969924ca50a5e1669bfb4704f1386cd17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:30Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:30 crc kubenswrapper[4677]: I1007 13:07:30.599465 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ad65790-6a90-4c21-b5c5-ac1ddf2cbe52\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34b15b8f71a10920a74f784c3440031e14726f661e10f628b269da08e70a7cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a51c6d4e82136b444754dc679f864558f74624af4ff94f794e473c92c8f6c87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96d7ddc445a61d5fd4959d1b3b4e2c93503111a12f461d945dd298a3f8540f65\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f3139524d94a487c756b4a7e3660c0a4380bcba8aa8b588368530f43aab0b1d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f3139524d94a487c756b4a7e3660c0a4380bcba8aa8b588368530f43aab0b1d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T13:07:25Z\\\",\\\"message\\\":\\\"le observer\\\\nW1007 13:07:25.065953 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1007 13:07:25.066115 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 13:07:25.067558 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2107846005/tls.crt::/tmp/serving-cert-2107846005/tls.key\\\\\\\"\\\\nI1007 13:07:25.378394 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1007 13:07:25.383525 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1007 13:07:25.383565 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1007 13:07:25.383605 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1007 13:07:25.383617 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1007 13:07:25.390977 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1007 13:07:25.391014 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1007 13:07:25.391020 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 13:07:25.391038 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 13:07:25.391053 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1007 13:07:25.391061 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1007 13:07:25.391071 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1007 13:07:25.391088 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1007 13:07:25.394664 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b92962cb41f37a615f473651c01e37f5d53e01f3fb4b7c0eb2092095bb55239\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d927fbc5242eb951e8c2ec6d2859315f34e4b190bb43e646044111d1f583bf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d927fbc5242eb951e8c2ec6d2859315f34e4b190bb43e646044111d1f583bf0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:30Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:30 crc kubenswrapper[4677]: I1007 13:07:30.611419 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ef29067dc23263a94c4f861ada9ebbe04aae442de3da9fa34db521177f60ce8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:30Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:30 crc kubenswrapper[4677]: I1007 13:07:30.625050 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:30Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:30 crc kubenswrapper[4677]: I1007 13:07:30.644421 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:30Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:30 crc kubenswrapper[4677]: I1007 13:07:30.658060 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:30Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:30 crc kubenswrapper[4677]: I1007 13:07:30.672695 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c40c47d2-50a4-43f1-9b6e-08b60a3260c5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f36e52a7e88b59d8fd38c1fe659ce9b539e514c9d31e326a3ed647ebb8d19781\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://84b1c015461fecca9e5122abe950f33e24f4b7188568ea84cb059a08a4637963\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"
running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ecf81a2a9f147c0d9643f8e6c45248164053203ca4e5bbdc57c38e5803a5386\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e8dc3f8bdc52104efdb49a017d6497e2aaa3ed2b593794413fcd1acf2e06d36\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:30Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:30 crc kubenswrapper[4677]: I1007 13:07:30.686684 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:30Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:30 crc kubenswrapper[4677]: I1007 13:07:30.699011 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-r7cnz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7879fa59-a7cb-4d29-ba3a-c91f43bfcba6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r59hd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r59hd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-r7cnz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:30Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:30 crc kubenswrapper[4677]: I1007 13:07:30.723224 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29c8j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3458826a-000d-407d-92c8-236d1a05842e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":
\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"rea
dOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-29c8j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:30Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:30 crc kubenswrapper[4677]: 
I1007 13:07:30.737825 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pjgpx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"73bebfb3-50b5-48b6-b348-1d1feb6202d2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h59cg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:29Z\
\\"}}\" for pod \"openshift-multus\"/\"multus-pjgpx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:30Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:30 crc kubenswrapper[4677]: I1007 13:07:30.756916 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9c35782-52f8-4fbc-9e52-07ee92002e3d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9182677b05c8d32f333b4e806b6dc29e0ce3f6171616ed303459ccb6a3754a4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://550a4491cebcbd8b3a62831cce07b13bb79051cd51505aab1f74bcfee692f7b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d90d7cc786a9f94c269a99be97c00685a2e10bde12e0afe4db2de40b95749a47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e99541a9f53339e760fb1074be18ebfcb8b225c64c290478559d2e3722ba9296\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73cec6b690e4d01d9d206a812f278832d622a7bdfb74ddcfb5904e19f721fae6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f16639187300b01b7c9f30dda26da7c1de9de9c66baf7ad716875eba41a5d7d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f16639187300b01b7c9f30dda26da7c1de9de9c66baf7ad716875eba41a5d7d6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dab4e59081dcacddedf345841dce8c940fae52aeee55d6c956e1364dd70872b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e
9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dab4e59081dcacddedf345841dce8c940fae52aeee55d6c956e1364dd70872b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://abcfe5aa83c73ae24b7f0b414b47344969924ca50a5e1669bfb4704f1386cd17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abcfe5aa83c73ae24b7f0b414b47344969924ca50a5e1669bfb4704f1386cd17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:30Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:30 crc kubenswrapper[4677]: I1007 13:07:30.780098 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ad65790-6a90-4c21-b5c5-ac1ddf2cbe52\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34b15b8f71a10920a74f784c3440031e14726f661e10f628b269da08e70a7cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a51c6d4e82136b444754dc679f864558f74624af4ff94f794e473c92c8f6c87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96d7ddc445a61d5fd4959d1b3b4e2c93503111a12f461d945dd298a3f8540f65\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f3139524d94a487c756b4a7e3660c0a4380bcba8aa8b588368530f43aab0b1d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f3139524d94a487c756b4a7e3660c0a4380bcba8aa8b588368530f43aab0b1d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T13:07:25Z\\\",\\\"message\\\":\\\"le observer\\\\nW1007 13:07:25.065953 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1007 13:07:25.066115 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 13:07:25.067558 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2107846005/tls.crt::/tmp/serving-cert-2107846005/tls.key\\\\\\\"\\\\nI1007 13:07:25.378394 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1007 13:07:25.383525 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1007 13:07:25.383565 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1007 13:07:25.383605 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1007 13:07:25.383617 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1007 13:07:25.390977 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1007 13:07:25.391014 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1007 13:07:25.391020 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 13:07:25.391038 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 13:07:25.391053 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1007 13:07:25.391061 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1007 13:07:25.391071 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1007 13:07:25.391088 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1007 13:07:25.394664 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b92962cb41f37a615f473651c01e37f5d53e01f3fb4b7c0eb2092095bb55239\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d927fbc5242eb951e8c2ec6d2859315f34e4b190bb43e646044111d1f583bf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d927fbc5242eb951e8c2ec6d2859315f34e4b190bb43e646044111d1f583bf0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:30Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:30 crc kubenswrapper[4677]: I1007 13:07:30.793898 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ef29067dc23263a94c4f861ada9ebbe04aae442de3da9fa34db521177f60ce8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:30Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:30 crc kubenswrapper[4677]: I1007 13:07:30.799622 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 13:07:30 crc kubenswrapper[4677]: E1007 13:07:30.799891 4677 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 13:07:32.79987338 +0000 UTC m=+24.285582495 (durationBeforeRetry 2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:07:30 crc kubenswrapper[4677]: I1007 13:07:30.808145 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:30Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:30 crc kubenswrapper[4677]: I1007 13:07:30.824275 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:30Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:30 crc kubenswrapper[4677]: I1007 13:07:30.836503 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:30Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:30 crc kubenswrapper[4677]: I1007 13:07:30.853601 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c40c47d2-50a4-43f1-9b6e-08b60a3260c5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f36e52a7e88b59d8fd38c1fe659ce9b539e514c9d31e326a3ed647ebb8d19781\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://84b1c015461fecca9e5122abe950f33e24f4b7188568ea84cb059a08a4637963\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"
running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ecf81a2a9f147c0d9643f8e6c45248164053203ca4e5bbdc57c38e5803a5386\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e8dc3f8bdc52104efdb49a017d6497e2aaa3ed2b593794413fcd1acf2e06d36\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:30Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:30 crc kubenswrapper[4677]: I1007 13:07:30.865499 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:30Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:30 crc kubenswrapper[4677]: I1007 13:07:30.886310 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-r7cnz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7879fa59-a7cb-4d29-ba3a-c91f43bfcba6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5b2cfaaf4533573a7cdf927cb9a0b61690f4f04ca22f5da5013fd218ee2cba1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r59hd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d3e4ef8267212ad1faf24bfcb3b6f633a283684ba587e304e94d434bd9a2618\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r59hd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-r7cnz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:30Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:30 crc kubenswrapper[4677]: I1007 13:07:30.900285 4677 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:07:30 crc kubenswrapper[4677]: I1007 13:07:30.900335 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:07:30 crc kubenswrapper[4677]: E1007 13:07:30.900394 4677 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 07 13:07:30 crc kubenswrapper[4677]: E1007 13:07:30.900457 4677 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-07 13:07:32.900445053 +0000 UTC m=+24.386154168 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 07 13:07:30 crc kubenswrapper[4677]: E1007 13:07:30.900492 4677 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 07 13:07:30 crc kubenswrapper[4677]: E1007 13:07:30.900555 4677 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-07 13:07:32.900539906 +0000 UTC m=+24.386249041 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 07 13:07:30 crc kubenswrapper[4677]: I1007 13:07:30.905124 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29c8j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3458826a-000d-407d-92c8-236d1a05842e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa229d0805576aa04db8e58e52e8c8bdc07663414dde42a94cac4fe628cfac7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa229d0805576aa04db8e58e52e8c8bdc07663414dde42a94cac4fe628cfac7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-29c8j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:30Z 
is after 2025-08-24T17:21:41Z" Oct 07 13:07:30 crc kubenswrapper[4677]: I1007 13:07:30.924259 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pjgpx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"73bebfb3-50b5-48b6-b348-1d1feb6202d2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b0ac92c71edc3d5107aece2d0e005a546cf25d79d696f4e330b7c0c8babc546\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h59cg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\
",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pjgpx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:30Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:30 crc kubenswrapper[4677]: I1007 13:07:30.938458 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf9347ca53bc58ad2e19bdbccd5eb40fde5ef36cdc0c2a2899e7e86977208446\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7932ee6d24ab75f34dabb17b5c2732dc1437e94b4fab6cace5c5bf4d8b4a8fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:30Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:30 crc kubenswrapper[4677]: I1007 13:07:30.948613 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c2h2k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6a7b491-6ed9-4906-8d2d-d8913a581b95\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://079f44a0676fd6e659268707658dfce76f5c80881ebd1b7f77b831a653002cbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gh4gv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c2h2k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:30Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:30 crc kubenswrapper[4677]: I1007 13:07:30.963112 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-czmsr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"67f7f734-b59a-447c-b4a5-5aeb78d3a4dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84e2aad54ae491e74845fbb681491bde96694b5f242d15a1b4c0f62b81048e7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84e2aad54ae491e74845fbb681491bde96694b5f242d15a1b4c0f62b81048e7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c85
7df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-czmsr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:30Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:31 crc kubenswrapper[4677]: I1007 13:07:31.001125 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 13:07:31 crc kubenswrapper[4677]: I1007 13:07:31.001173 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 13:07:31 crc kubenswrapper[4677]: E1007 13:07:31.001270 4677 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 07 13:07:31 crc kubenswrapper[4677]: E1007 13:07:31.001283 4677 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 07 13:07:31 crc kubenswrapper[4677]: E1007 13:07:31.001293 4677 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 13:07:31 crc kubenswrapper[4677]: E1007 13:07:31.001332 4677 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-07 13:07:33.001319495 +0000 UTC m=+24.487028610 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 13:07:31 crc kubenswrapper[4677]: E1007 13:07:31.001376 4677 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 07 13:07:31 crc kubenswrapper[4677]: E1007 13:07:31.001386 4677 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 07 13:07:31 crc kubenswrapper[4677]: E1007 13:07:31.001392 4677 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 13:07:31 crc kubenswrapper[4677]: E1007 13:07:31.001418 4677 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-07 13:07:33.001412188 +0000 UTC m=+24.487121303 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 13:07:31 crc kubenswrapper[4677]: I1007 13:07:31.302650 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 13:07:31 crc kubenswrapper[4677]: I1007 13:07:31.302650 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:07:31 crc kubenswrapper[4677]: E1007 13:07:31.302847 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 13:07:31 crc kubenswrapper[4677]: E1007 13:07:31.302967 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 13:07:31 crc kubenswrapper[4677]: I1007 13:07:31.306362 4677 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Oct 07 13:07:31 crc kubenswrapper[4677]: I1007 13:07:31.307219 4677 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Oct 07 13:07:31 crc kubenswrapper[4677]: I1007 13:07:31.308309 4677 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Oct 07 13:07:31 crc kubenswrapper[4677]: I1007 13:07:31.308889 4677 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Oct 07 13:07:31 crc kubenswrapper[4677]: I1007 13:07:31.309757 4677 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Oct 07 13:07:31 crc kubenswrapper[4677]: I1007 13:07:31.310337 4677 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Oct 07 13:07:31 crc kubenswrapper[4677]: I1007 13:07:31.310857 4677 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Oct 07 13:07:31 crc kubenswrapper[4677]: I1007 13:07:31.482735 4677 generic.go:334] "Generic (PLEG): container finished" podID="67f7f734-b59a-447c-b4a5-5aeb78d3a4dc" containerID="b2dfd4b5d1804d9d8e9ccf4dea8fb23cf60cd039e32d4bfc7d0b0e189da178b0" exitCode=0 Oct 07 13:07:31 crc kubenswrapper[4677]: I1007 13:07:31.482990 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-czmsr" event={"ID":"67f7f734-b59a-447c-b4a5-5aeb78d3a4dc","Type":"ContainerDied","Data":"b2dfd4b5d1804d9d8e9ccf4dea8fb23cf60cd039e32d4bfc7d0b0e189da178b0"} Oct 07 13:07:31 crc kubenswrapper[4677]: I1007 13:07:31.487314 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29c8j" event={"ID":"3458826a-000d-407d-92c8-236d1a05842e","Type":"ContainerStarted","Data":"eee7c253a1a514447553be977a3e534608ef6a1178664bf139ee84ec41180db0"} Oct 07 13:07:31 crc kubenswrapper[4677]: I1007 13:07:31.487342 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29c8j" event={"ID":"3458826a-000d-407d-92c8-236d1a05842e","Type":"ContainerStarted","Data":"b77b2aafb3baf1c5b72d62156bd1c1bec76385637d5795166fe3d4f22a169503"} Oct 07 13:07:31 crc kubenswrapper[4677]: I1007 13:07:31.487351 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29c8j" event={"ID":"3458826a-000d-407d-92c8-236d1a05842e","Type":"ContainerStarted","Data":"1cf7d8cdd34bc883eae38c5e4690efd4e1c29cc633b5bbadc5de2b5b844a9da3"} Oct 07 13:07:31 crc kubenswrapper[4677]: I1007 13:07:31.487361 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-29c8j" event={"ID":"3458826a-000d-407d-92c8-236d1a05842e","Type":"ContainerStarted","Data":"f333db7aeb7d3cd308131992b4cd1284c1c56e27bbfd731404febc0efc953925"} Oct 07 13:07:31 crc kubenswrapper[4677]: I1007 13:07:31.487369 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29c8j" event={"ID":"3458826a-000d-407d-92c8-236d1a05842e","Type":"ContainerStarted","Data":"3ddf4e352b778815786f6fb204486a53d958310e53569f89a2895fe388a727da"} Oct 07 13:07:31 crc kubenswrapper[4677]: I1007 13:07:31.507054 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf9347ca53bc58ad2e19bdbccd5eb40fde5ef36cdc0c2a2899e7e86977208446\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7932ee6d24ab75f34dabb17b5c2732dc1437e94b4fab6cace5c5bf4d8b4a8fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:31Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:31 crc kubenswrapper[4677]: I1007 13:07:31.515904 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c2h2k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6a7b491-6ed9-4906-8d2d-d8913a581b95\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://079f44a0676fd6e659268707658dfce76f5c80881ebd1b7f77b831a653002cbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gh4gv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c2h2k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:31Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:31 crc kubenswrapper[4677]: I1007 13:07:31.531339 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-czmsr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"67f7f734-b59a-447c-b4a5-5aeb78d3a4dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84e2aad54ae491e74845fbb681491bde96694b5f242d15a1b4c0f62b81048e7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84e2aad54ae491e74845fbb681491bde96694b5f242d15a1b4c0f62b81048e7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2dfd4b5d1804d9d8e9ccf4dea8fb23cf60cd039e32d4bfc7d0b0e189da178b0\\\",\\\"image\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2dfd4b5d1804d9d8e9ccf4dea8fb23cf60cd039e32d4bfc7d0b0e189da178b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"moun
tPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-czmsr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:31Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:31 crc kubenswrapper[4677]: I1007 13:07:31.553845 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9c35782-52f8-4fbc-9e52-07ee92002e3d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9182677b05c8d32f333b4e806b6dc29e0ce3f6171616ed303459ccb6a3754a4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://550a4491cebcbd8b3a62831cce07b13bb79051cd51505aab1f74bcfee692f7b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d90d7cc786a9f94c269a99be97c00685a2e10bde12e0afe4db2de40b95749a47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e99541a9f53339e760fb1074be18ebfcb8b225c
64c290478559d2e3722ba9296\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73cec6b690e4d01d9d206a812f278832d622a7bdfb74ddcfb5904e19f721fae6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f16639187300b01b7c9f30dda26da7c1de9de9c66baf7ad716875eba41a5d7d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f16639187300b01b7c9f30dda26da7c1de9de9c66baf7ad716875eba41a5d7d6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dab4e59081dcacddedf345841dce8c940fae52aeee55d6c956e1364dd70872b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dab4e59081dcacddedf345841dce8c940fae52aeee55d6c956e1364dd70872b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://abcfe5aa83c73ae24b7f0b414b47344969924ca50a5e1669bfb4704f1386cd17\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abcfe5aa83c73ae24b7f0b414b47344969924ca50a5e1669bfb4704f1386cd17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:31Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:31 crc kubenswrapper[4677]: I1007 13:07:31.572527 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ad65790-6a90-4c21-b5c5-ac1ddf2cbe52\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34b15b8f71a10920a74f784c3440031e14726f661e10f628b269da08e70a7cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a51c6d4e82136b444754dc679f864558f74624af4ff94f794e473c92c8f6c87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96d7ddc445a61d5fd4959d1b3b4e2c93503111a12f461d945dd298a3f8540f65\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f3139524d94a487c756b4a7e3660c0a4380bcba8aa8b588368530f43aab0b1d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f3139524d94a487c756b4a7e3660c0a4380bcba8aa8b588368530f43aab0b1d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T13:07:25Z\\\",\\\"message\\\":\\\"le observer\\\\nW1007 13:07:25.065953 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1007 13:07:25.066115 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 13:07:25.067558 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2107846005/tls.crt::/tmp/serving-cert-2107846005/tls.key\\\\\\\"\\\\nI1007 13:07:25.378394 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1007 13:07:25.383525 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1007 13:07:25.383565 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1007 13:07:25.383605 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1007 13:07:25.383617 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1007 13:07:25.390977 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1007 13:07:25.391014 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1007 13:07:25.391020 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 13:07:25.391038 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 13:07:25.391053 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1007 13:07:25.391061 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1007 13:07:25.391071 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1007 13:07:25.391088 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1007 13:07:25.394664 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b92962cb41f37a615f473651c01e37f5d53e01f3fb4b7c0eb2092095bb55239\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d927fbc5242eb951e8c2ec6d2859315f34e4b190bb43e646044111d1f583bf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d927fbc5242eb951e8c2ec6d2859315f34e4b190bb43e646044111d1f583bf0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:31Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:31 crc kubenswrapper[4677]: I1007 13:07:31.588029 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ef29067dc23263a94c4f861ada9ebbe04aae442de3da9fa34db521177f60ce8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:31Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:31 crc kubenswrapper[4677]: I1007 13:07:31.603463 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:31Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:31 crc kubenswrapper[4677]: I1007 13:07:31.614783 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:31Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:31 crc kubenswrapper[4677]: I1007 13:07:31.625954 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:31Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:31 crc kubenswrapper[4677]: I1007 13:07:31.641972 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c40c47d2-50a4-43f1-9b6e-08b60a3260c5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f36e52a7e88b59d8fd38c1fe659ce9b539e514c9d31e326a3ed647ebb8d19781\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://84b1c015461fecca9e5122abe950f33e24f4b7188568ea84cb059a08a4637963\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"
running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ecf81a2a9f147c0d9643f8e6c45248164053203ca4e5bbdc57c38e5803a5386\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e8dc3f8bdc52104efdb49a017d6497e2aaa3ed2b593794413fcd1acf2e06d36\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:31Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:31 crc kubenswrapper[4677]: I1007 13:07:31.656781 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:31Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:31 crc kubenswrapper[4677]: I1007 13:07:31.669575 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-r7cnz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7879fa59-a7cb-4d29-ba3a-c91f43bfcba6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5b2cfaaf4533573a7cdf927cb9a0b61690f4f04ca22f5da5013fd218ee2cba1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r59hd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d3e4ef8267212ad1faf24bfcb3b6f633a283684ba587e304e94d434bd9a2618\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r59hd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-r7cnz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:31Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:31 crc kubenswrapper[4677]: I1007 13:07:31.687550 4677 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29c8j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3458826a-000d-407d-92c8-236d1a05842e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",
\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47e
f0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa229d0805576aa04db8e58e52e8c8bdc07663414dde42a94cac4fe628cfac7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17
b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa229d0805576aa04db8e58e52e8c8bdc07663414dde42a94cac4fe628cfac7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-29c8j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:31Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:31 crc kubenswrapper[4677]: I1007 13:07:31.706863 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pjgpx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"73bebfb3-50b5-48b6-b348-1d1feb6202d2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b0ac92c71edc3d5107aece2d0e005a546cf25d79d696f4e330b7c0c8babc546\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":
\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h59cg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pjgpx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:31Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:31 crc kubenswrapper[4677]: I1007 13:07:31.910817 4677 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-8xd94"] Oct 07 13:07:31 crc kubenswrapper[4677]: I1007 13:07:31.911573 4677 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-8xd94" Oct 07 13:07:31 crc kubenswrapper[4677]: I1007 13:07:31.913978 4677 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Oct 07 13:07:31 crc kubenswrapper[4677]: I1007 13:07:31.914518 4677 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Oct 07 13:07:31 crc kubenswrapper[4677]: I1007 13:07:31.915639 4677 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Oct 07 13:07:31 crc kubenswrapper[4677]: I1007 13:07:31.915985 4677 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Oct 07 13:07:31 crc kubenswrapper[4677]: I1007 13:07:31.930579 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c40c47d2-50a4-43f1-9b6e-08b60a3260c5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f36e52a7e88b59d8fd38c1fe659ce9b539e514c9d31e326a3ed647ebb8d19781\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://84b1c015461fecca9e5122abe950f33e24f4b7188568ea84cb059a08a4637963\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\
\\":\\\"cri-o://7ecf81a2a9f147c0d9643f8e6c45248164053203ca4e5bbdc57c38e5803a5386\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e8dc3f8bdc52104efdb49a017d6497e2aaa3ed2b593794413fcd1acf2e06d36\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:31Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:31 crc kubenswrapper[4677]: I1007 13:07:31.951806 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:31Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:31 crc kubenswrapper[4677]: I1007 13:07:31.965483 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-r7cnz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7879fa59-a7cb-4d29-ba3a-c91f43bfcba6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5b2cfaaf4533573a7cdf927cb9a0b61690f4f04ca22f5da5013fd218ee2cba1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r59hd\\\",\\\"readOnly\\\":true,\\\"rec
ursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d3e4ef8267212ad1faf24bfcb3b6f633a283684ba587e304e94d434bd9a2618\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r59hd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-r7cnz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:31Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:31 crc kubenswrapper[4677]: I1007 13:07:31.983750 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29c8j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3458826a-000d-407d-92c8-236d1a05842e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa229d0805576aa04db8e58e52e8c8bdc07663414dde42a94cac4fe628cfac7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa229d0805576aa04db8e58e52e8c8bdc07663414dde42a94cac4fe628cfac7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-29c8j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:31Z 
is after 2025-08-24T17:21:41Z" Oct 07 13:07:31 crc kubenswrapper[4677]: I1007 13:07:31.998894 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pjgpx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"73bebfb3-50b5-48b6-b348-1d1feb6202d2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b0ac92c71edc3d5107aece2d0e005a546cf25d79d696f4e330b7c0c8babc546\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h59cg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\
",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pjgpx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:31Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:32 crc kubenswrapper[4677]: I1007 13:07:32.010969 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k9nz5\" (UniqueName: \"kubernetes.io/projected/f78c6e9a-e5e3-4296-b2e6-3ba36d1808ff-kube-api-access-k9nz5\") pod \"node-ca-8xd94\" (UID: \"f78c6e9a-e5e3-4296-b2e6-3ba36d1808ff\") " pod="openshift-image-registry/node-ca-8xd94" Oct 07 13:07:32 crc kubenswrapper[4677]: I1007 13:07:32.011225 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/f78c6e9a-e5e3-4296-b2e6-3ba36d1808ff-serviceca\") pod \"node-ca-8xd94\" (UID: \"f78c6e9a-e5e3-4296-b2e6-3ba36d1808ff\") " pod="openshift-image-registry/node-ca-8xd94" Oct 07 13:07:32 crc kubenswrapper[4677]: I1007 13:07:32.011324 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f78c6e9a-e5e3-4296-b2e6-3ba36d1808ff-host\") pod \"node-ca-8xd94\" (UID: \"f78c6e9a-e5e3-4296-b2e6-3ba36d1808ff\") " pod="openshift-image-registry/node-ca-8xd94" Oct 07 13:07:32 crc kubenswrapper[4677]: I1007 13:07:32.013248 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf9347ca53bc58ad2e19bdbccd5eb40fde5ef36cdc0c2a2899e7e86977208446\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7932ee6d24ab75f34dabb17b5c2732dc1437e94b4fab6cace5c5bf4d8b4a8fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:32Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:32 crc kubenswrapper[4677]: I1007 13:07:32.022619 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c2h2k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6a7b491-6ed9-4906-8d2d-d8913a581b95\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://079f44a0676fd6e659268707658dfce76f5c80881ebd1b7f77b831a653002cbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gh4gv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c2h2k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:32Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:32 crc kubenswrapper[4677]: I1007 13:07:32.038002 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-czmsr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67f7f734-b59a-447c-b4a5-5aeb78d3a4dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84e2aad54ae491e74845fbb681491bde96694b5f242d15a1b4c0f62b81048e7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84e2aad54ae491e74845fbb681491bde96694b5f242d15a1b4c0f62b81048e7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2dfd4b5d1804d9d8e9ccf4dea8fb23cf60cd039e32d4bfc7d0b0e189da178b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2dfd4b5d1804d9d8e9ccf4dea8fb23cf60cd039e32d4bfc7d0b0e189da178b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\
"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":
{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-czmsr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:32Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:32 crc kubenswrapper[4677]: I1007 13:07:32.050941 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8xd94" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f78c6e9a-e5e3-4296-b2e6-3ba36d1808ff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:31Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:31Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8xd94\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate 
has expired or is not yet valid: current time 2025-10-07T13:07:32Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:32 crc kubenswrapper[4677]: I1007 13:07:32.073419 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9c35782-52f8-4fbc-9e52-07ee92002e3d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9182677b05c8d32f333b4e806b6dc29e0ce3f6171616ed303459ccb6a3754a4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://550a4491cebcbd8b3a62831cce07b13bb79051cd51505aab1f74bcfee692f7b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d90d7cc786a9f94c269a99be97c00685a2e10bde12e0afe4db2de40b95749a47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-
10-07T13:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e99541a9f53339e760fb1074be18ebfcb8b225c64c290478559d2e3722ba9296\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73cec6b690e4d01d9d206a812f278832d622a7bdfb74ddcfb5904e19f721fae6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f16639187300b01b7c9f30dda26da7c1de9de9c66baf7ad716875eba41a5d7d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f16639187300b01b7c9f30dda26da7c1de9de9c66baf7ad716875eba41a5d7d6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dab4e59081dcacddedf345841dce8c940fae52aeee55d6c956e1364dd70872b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dab4e59081dcacddedf345841dce8c940fae52a
eee55d6c956e1364dd70872b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://abcfe5aa83c73ae24b7f0b414b47344969924ca50a5e1669bfb4704f1386cd17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abcfe5aa83c73ae24b7f0b414b47344969924ca50a5e1669bfb4704f1386cd17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:32Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:32 crc kubenswrapper[4677]: I1007 13:07:32.088389 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ad65790-6a90-4c21-b5c5-ac1ddf2cbe52\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34b15b8f71a10920a74f784c3440031e14726f661e10f628b269da08e70a7cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a51c6d4e82136b444754dc679f864558f74624af4ff94f794e473c92c8f6c87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96d7ddc445a61d5fd4959d1b3b4e2c93503111a12f461d945dd298a3f8540f65\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f3139524d94a487c756b4a7e3660c0a4380bcba8aa8b588368530f43aab0b1d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f3139524d94a487c756b4a7e3660c0a4380bcba8aa8b588368530f43aab0b1d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T13:07:25Z\\\",\\\"message\\\":\\\"le observer\\\\nW1007 13:07:25.065953 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1007 13:07:25.066115 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 13:07:25.067558 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2107846005/tls.crt::/tmp/serving-cert-2107846005/tls.key\\\\\\\"\\\\nI1007 13:07:25.378394 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1007 13:07:25.383525 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1007 13:07:25.383565 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1007 13:07:25.383605 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1007 13:07:25.383617 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1007 13:07:25.390977 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1007 13:07:25.391014 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1007 13:07:25.391020 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 13:07:25.391038 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 13:07:25.391053 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1007 13:07:25.391061 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1007 13:07:25.391071 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1007 13:07:25.391088 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1007 13:07:25.394664 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b92962cb41f37a615f473651c01e37f5d53e01f3fb4b7c0eb2092095bb55239\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d927fbc5242eb951e8c2ec6d2859315f34e4b190bb43e646044111d1f583bf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d927fbc5242eb951e8c2ec6d2859315f34e4b190bb43e646044111d1f583bf0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:32Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:32 crc kubenswrapper[4677]: I1007 13:07:32.101155 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ef29067dc23263a94c4f861ada9ebbe04aae442de3da9fa34db521177f60ce8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:32Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:32 crc kubenswrapper[4677]: I1007 13:07:32.111927 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k9nz5\" (UniqueName: \"kubernetes.io/projected/f78c6e9a-e5e3-4296-b2e6-3ba36d1808ff-kube-api-access-k9nz5\") pod \"node-ca-8xd94\" (UID: \"f78c6e9a-e5e3-4296-b2e6-3ba36d1808ff\") " pod="openshift-image-registry/node-ca-8xd94" Oct 07 13:07:32 crc kubenswrapper[4677]: I1007 13:07:32.112071 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/f78c6e9a-e5e3-4296-b2e6-3ba36d1808ff-serviceca\") pod \"node-ca-8xd94\" (UID: \"f78c6e9a-e5e3-4296-b2e6-3ba36d1808ff\") " pod="openshift-image-registry/node-ca-8xd94" Oct 07 13:07:32 crc kubenswrapper[4677]: I1007 13:07:32.112162 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f78c6e9a-e5e3-4296-b2e6-3ba36d1808ff-host\") pod \"node-ca-8xd94\" (UID: \"f78c6e9a-e5e3-4296-b2e6-3ba36d1808ff\") " pod="openshift-image-registry/node-ca-8xd94" Oct 07 13:07:32 crc kubenswrapper[4677]: I1007 13:07:32.112319 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f78c6e9a-e5e3-4296-b2e6-3ba36d1808ff-host\") pod \"node-ca-8xd94\" (UID: \"f78c6e9a-e5e3-4296-b2e6-3ba36d1808ff\") " 
pod="openshift-image-registry/node-ca-8xd94" Oct 07 13:07:32 crc kubenswrapper[4677]: I1007 13:07:32.112860 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:32Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:32 crc kubenswrapper[4677]: I1007 13:07:32.114109 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/f78c6e9a-e5e3-4296-b2e6-3ba36d1808ff-serviceca\") pod \"node-ca-8xd94\" (UID: \"f78c6e9a-e5e3-4296-b2e6-3ba36d1808ff\") " pod="openshift-image-registry/node-ca-8xd94" Oct 07 13:07:32 crc kubenswrapper[4677]: I1007 13:07:32.139583 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k9nz5\" (UniqueName: \"kubernetes.io/projected/f78c6e9a-e5e3-4296-b2e6-3ba36d1808ff-kube-api-access-k9nz5\") pod \"node-ca-8xd94\" (UID: \"f78c6e9a-e5e3-4296-b2e6-3ba36d1808ff\") " pod="openshift-image-registry/node-ca-8xd94" Oct 07 13:07:32 crc kubenswrapper[4677]: I1007 13:07:32.139921 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:32Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:32 crc kubenswrapper[4677]: I1007 13:07:32.163654 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be 
located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:32Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:32 crc kubenswrapper[4677]: I1007 13:07:32.227416 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-8xd94" Oct 07 13:07:32 crc kubenswrapper[4677]: W1007 13:07:32.241005 4677 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf78c6e9a_e5e3_4296_b2e6_3ba36d1808ff.slice/crio-81f2a3d3d11a5416731a4c121b1019f5f9dc26bb97edeffe2f4c2469cdd5ea09 WatchSource:0}: Error finding container 81f2a3d3d11a5416731a4c121b1019f5f9dc26bb97edeffe2f4c2469cdd5ea09: Status 404 returned error can't find the container with id 81f2a3d3d11a5416731a4c121b1019f5f9dc26bb97edeffe2f4c2469cdd5ea09 Oct 07 13:07:32 crc kubenswrapper[4677]: I1007 13:07:32.302683 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 13:07:32 crc kubenswrapper[4677]: E1007 13:07:32.302876 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 13:07:32 crc kubenswrapper[4677]: I1007 13:07:32.500058 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29c8j" event={"ID":"3458826a-000d-407d-92c8-236d1a05842e","Type":"ContainerStarted","Data":"99b5fbb5ad3249aa5264c37bd635ed5f6283ec72c7eb071002cd7bddc12052f7"} Oct 07 13:07:32 crc kubenswrapper[4677]: I1007 13:07:32.502113 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"e49f4b2f98de9e297e6a31a5583120192adf9a013700b49bb419e54d9e75fdbd"} Oct 07 13:07:32 crc kubenswrapper[4677]: I1007 13:07:32.503707 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-8xd94" event={"ID":"f78c6e9a-e5e3-4296-b2e6-3ba36d1808ff","Type":"ContainerStarted","Data":"81f2a3d3d11a5416731a4c121b1019f5f9dc26bb97edeffe2f4c2469cdd5ea09"} Oct 07 13:07:32 crc kubenswrapper[4677]: I1007 13:07:32.509594 4677 generic.go:334] "Generic (PLEG): container finished" podID="67f7f734-b59a-447c-b4a5-5aeb78d3a4dc" containerID="c337ed2525a5d0aca41ff193138829ecd98172765110f97081d5a9b57ea39519" exitCode=0 Oct 07 13:07:32 crc kubenswrapper[4677]: I1007 13:07:32.509681 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-czmsr" event={"ID":"67f7f734-b59a-447c-b4a5-5aeb78d3a4dc","Type":"ContainerDied","Data":"c337ed2525a5d0aca41ff193138829ecd98172765110f97081d5a9b57ea39519"} Oct 07 13:07:32 crc kubenswrapper[4677]: I1007 13:07:32.519133 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c40c47d2-50a4-43f1-9b6e-08b60a3260c5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f36e52a7e88b59d8fd38c1fe659ce9b539e514c9d31e326a3ed647ebb8d19781\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://84b1c015461fecca9e5122abe950f33e24f4b7188568ea84cb059a08a4637963\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ecf81a2a9f147c0d9643f8e6c45248164053203ca4e5bbdc57c38e5803a5386\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e8dc3f8bdc52104efdb49a017d6497e2aaa3ed2b593794413fcd1acf2e06d36\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:32Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:32 crc kubenswrapper[4677]: I1007 13:07:32.535464 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:32Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:32 crc kubenswrapper[4677]: I1007 13:07:32.548206 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-r7cnz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7879fa59-a7cb-4d29-ba3a-c91f43bfcba6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5b2cfaaf4533573a7cdf927cb9a0b61690f4f04ca22f5da5013fd218ee2cba1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r59hd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d3e4ef8267212ad1faf24bfcb3b6f633a283684ba587e304e94d434bd9a2618\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r59hd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-r7cnz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:32Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:32 crc kubenswrapper[4677]: I1007 13:07:32.569142 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29c8j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3458826a-000d-407d-92c8-236d1a05842e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa229d0805576aa04db8e58e52e8c8bdc07663414dde42a94cac4fe628cfac7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa229d0805576aa04db8e58e52e8c8bdc07663414dde42a94cac4fe628cfac7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-29c8j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:32Z 
is after 2025-08-24T17:21:41Z" Oct 07 13:07:32 crc kubenswrapper[4677]: I1007 13:07:32.585567 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pjgpx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"73bebfb3-50b5-48b6-b348-1d1feb6202d2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b0ac92c71edc3d5107aece2d0e005a546cf25d79d696f4e330b7c0c8babc546\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h59cg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\
",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pjgpx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:32Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:32 crc kubenswrapper[4677]: I1007 13:07:32.597436 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf9347ca53bc58ad2e19bdbccd5eb40fde5ef36cdc0c2a2899e7e86977208446\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7932ee6d24ab75f34dabb17b5c2732dc1437e94b4fab6cace5c5bf4d8b4a8fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:32Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:32 crc kubenswrapper[4677]: I1007 13:07:32.605769 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c2h2k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6a7b491-6ed9-4906-8d2d-d8913a581b95\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://079f44a0676fd6e659268707658dfce76f5c80881ebd1b7f77b831a653002cbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gh4gv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c2h2k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:32Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:32 crc kubenswrapper[4677]: I1007 13:07:32.620012 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-czmsr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"67f7f734-b59a-447c-b4a5-5aeb78d3a4dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84e2aad54ae491e74845fbb681491bde96694b5f242d15a1b4c0f62b81048e7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84e2aad54ae491e74845fbb681491bde96694b5f242d15a1b4c0f62b81048e7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2dfd4b5d1804d9d8e9ccf4dea8fb23cf60cd039e32d4bfc7d0b0e189da178b0\\\",\\\"image\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2dfd4b5d1804d9d8e9ccf4dea8fb23cf60cd039e32d4bfc7d0b0e189da178b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"moun
tPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-czmsr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:32Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:32 crc kubenswrapper[4677]: I1007 13:07:32.645577 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9c35782-52f8-4fbc-9e52-07ee92002e3d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9182677b05c8d32f333b4e806b6dc29e0ce3f6171616ed303459ccb6a3754a4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://550a4491cebcbd8b3a62831cce07b13bb79051cd51505aab1f74bcfee692f7b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d90d7cc786a9f94c269a99be97c00685a2e10bde12e0afe4db2de40b95749a47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e99541a9f53339e760fb1074be18ebfcb8b225c
64c290478559d2e3722ba9296\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73cec6b690e4d01d9d206a812f278832d622a7bdfb74ddcfb5904e19f721fae6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f16639187300b01b7c9f30dda26da7c1de9de9c66baf7ad716875eba41a5d7d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f16639187300b01b7c9f30dda26da7c1de9de9c66baf7ad716875eba41a5d7d6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dab4e59081dcacddedf345841dce8c940fae52aeee55d6c956e1364dd70872b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dab4e59081dcacddedf345841dce8c940fae52aeee55d6c956e1364dd70872b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://abcfe5aa83c73ae24b7f0b414b47344969924ca50a5e1669bfb4704f1386cd17\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abcfe5aa83c73ae24b7f0b414b47344969924ca50a5e1669bfb4704f1386cd17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:32Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:32 crc kubenswrapper[4677]: I1007 13:07:32.662027 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ad65790-6a90-4c21-b5c5-ac1ddf2cbe52\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34b15b8f71a10920a74f784c3440031e14726f661e10f628b269da08e70a7cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a51c6d4e82136b444754dc679f864558f74624af4ff94f794e473c92c8f6c87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96d7ddc445a61d5fd4959d1b3b4e2c93503111a12f461d945dd298a3f8540f65\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f3139524d94a487c756b4a7e3660c0a4380bcba8aa8b588368530f43aab0b1d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f3139524d94a487c756b4a7e3660c0a4380bcba8aa8b588368530f43aab0b1d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T13:07:25Z\\\",\\\"message\\\":\\\"le observer\\\\nW1007 13:07:25.065953 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1007 13:07:25.066115 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 13:07:25.067558 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2107846005/tls.crt::/tmp/serving-cert-2107846005/tls.key\\\\\\\"\\\\nI1007 13:07:25.378394 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1007 13:07:25.383525 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1007 13:07:25.383565 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1007 13:07:25.383605 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1007 13:07:25.383617 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1007 13:07:25.390977 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1007 13:07:25.391014 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1007 13:07:25.391020 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 13:07:25.391038 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 13:07:25.391053 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1007 13:07:25.391061 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1007 13:07:25.391071 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1007 13:07:25.391088 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1007 13:07:25.394664 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b92962cb41f37a615f473651c01e37f5d53e01f3fb4b7c0eb2092095bb55239\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d927fbc5242eb951e8c2ec6d2859315f34e4b190bb43e646044111d1f583bf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d927fbc5242eb951e8c2ec6d2859315f34e4b190bb43e646044111d1f583bf0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:32Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:32 crc kubenswrapper[4677]: I1007 13:07:32.682853 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ef29067dc23263a94c4f861ada9ebbe04aae442de3da9fa34db521177f60ce8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:32Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:32 crc kubenswrapper[4677]: I1007 13:07:32.697528 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:32Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:32 crc kubenswrapper[4677]: I1007 13:07:32.708617 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8xd94" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f78c6e9a-e5e3-4296-b2e6-3ba36d1808ff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:31Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:31Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8xd94\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:32Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:32 crc kubenswrapper[4677]: I1007 13:07:32.720161 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:32Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:32 crc kubenswrapper[4677]: I1007 13:07:32.732493 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e49f4b2f98de9e297e6a31a5583120192adf9a013700b49bb419e54d9e75fdbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:32Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:32 crc kubenswrapper[4677]: I1007 13:07:32.744357 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:32Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:32 crc kubenswrapper[4677]: I1007 13:07:32.756635 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e49f4b2f98de9e297e6a31a5583120192adf9a013700b49bb419e54d9e75fdbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:32Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:32 crc kubenswrapper[4677]: I1007 13:07:32.768844 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-r7cnz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7879fa59-a7cb-4d29-ba3a-c91f43bfcba6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5b2cfaaf4533573a7cdf927cb9a0b61690f4f04ca22f5da5013fd218ee2cba1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r59hd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d3e4ef8267212ad1faf24bfcb3b6f633a283684ba587e304e94d434bd9a2618\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r59hd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-r7cnz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:32Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:32 crc kubenswrapper[4677]: I1007 13:07:32.789789 4677 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29c8j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3458826a-000d-407d-92c8-236d1a05842e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",
\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47e
f0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa229d0805576aa04db8e58e52e8c8bdc07663414dde42a94cac4fe628cfac7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17
b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa229d0805576aa04db8e58e52e8c8bdc07663414dde42a94cac4fe628cfac7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-29c8j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:32Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:32 crc kubenswrapper[4677]: I1007 13:07:32.805423 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pjgpx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"73bebfb3-50b5-48b6-b348-1d1feb6202d2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b0ac92c71edc3d5107aece2d0e005a546cf25d79d696f4e330b7c0c8babc546\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":
\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h59cg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pjgpx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:32Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:32 crc kubenswrapper[4677]: I1007 13:07:32.817209 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 13:07:32 crc kubenswrapper[4677]: E1007 13:07:32.817377 4677 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 13:07:36.81734881 +0000 UTC m=+28.303057945 (durationBeforeRetry 4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:07:32 crc kubenswrapper[4677]: I1007 13:07:32.823981 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c40c47d2-50a4-43f1-9b6e-08b60a3260c5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f36e52a7e88b59d8fd38c1fe659ce9b539e514c9d31e326a3ed647ebb8d19781\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://84b1c015461fecca9e5122abe950f33e24f4b7188568ea84cb059a08a4637963\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ecf81a2a9f147c0d9643f8e6c45248164053203ca4e5bbdc57c38e5803a5386\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc1
8fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e8dc3f8bdc52104efdb49a017d6497e2aaa3ed2b593794413fcd1acf2e06d36\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:32Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:32 crc kubenswrapper[4677]: I1007 13:07:32.837619 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:32Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:32 crc kubenswrapper[4677]: I1007 13:07:32.851423 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c2h2k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6a7b491-6ed9-4906-8d2d-d8913a581b95\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://079f44a0676fd6e659268707658dfce76f5c80881ebd1b7f77b831a653002cbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gh4gv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c2h2k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-10-07T13:07:32Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:32 crc kubenswrapper[4677]: I1007 13:07:32.872156 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-czmsr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67f7f734-b59a-447c-b4a5-5aeb78d3a4dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84e2aad54ae491e74845fbb681491bde96694b5f242d15a1b4c0f62b81048e7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84e2aad54ae491e74845fbb681491bde96694b5f242d15a1b4c0f62b81048e7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.i
o/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2dfd4b5d1804d9d8e9ccf4dea8fb23cf60cd039e32d4bfc7d0b0e189da178b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2dfd4b5d1804d9d8e9ccf4dea8fb23cf60cd039e32d4bfc7d0b0e189da178b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c337ed2525a5d0aca41ff193138829ecd98172765110f97081d5a9b57ea39519\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c337ed2525a5d0aca41ff193138829ecd98172765110f97081d5a9b57ea39519\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\
"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-czmsr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:32Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:32 crc kubenswrapper[4677]: I1007 13:07:32.892562 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf9347ca53bc58ad2e19bdbccd5eb40fde5ef36cdc0c2a2899e7e86977208446\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7932ee6d24ab75f34dabb17b5c2732dc1437e94b4fab6cace5c5bf4d8b4a8fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:32Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:32 crc kubenswrapper[4677]: I1007 13:07:32.909752 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ef29067dc23263a94c4f861ada9ebbe04aae442de3da9fa34db521177f60ce8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:32Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:32 crc kubenswrapper[4677]: I1007 13:07:32.918850 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:07:32 crc kubenswrapper[4677]: I1007 13:07:32.918899 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:07:32 crc kubenswrapper[4677]: E1007 13:07:32.918985 4677 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 07 13:07:32 crc kubenswrapper[4677]: E1007 13:07:32.919028 4677 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 07 13:07:32 crc kubenswrapper[4677]: E1007 13:07:32.919033 4677 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-07 13:07:36.919019716 +0000 UTC m=+28.404728831 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 07 13:07:32 crc kubenswrapper[4677]: E1007 13:07:32.919089 4677 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-07 13:07:36.919077557 +0000 UTC m=+28.404786672 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 07 13:07:32 crc kubenswrapper[4677]: I1007 13:07:32.925862 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:32Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:32 crc kubenswrapper[4677]: I1007 13:07:32.937878 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8xd94" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f78c6e9a-e5e3-4296-b2e6-3ba36d1808ff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:31Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:31Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8xd94\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to 
call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:32Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:32 crc kubenswrapper[4677]: I1007 13:07:32.962117 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9c35782-52f8-4fbc-9e52-07ee92002e3d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9182677b05c8d32f333b4e806b6dc29e0ce3f6171616ed303459ccb6a3754a4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://550a4491cebcbd8b3a62831cce07b13bb79051cd51505aab1f74bcfee692f7b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d90d7cc786a9f94c269a99be97c00685a2e10bde12e0afe4db2de40b95749a47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"re
ady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e99541a9f53339e760fb1074be18ebfcb8b225c64c290478559d2e3722ba9296\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73cec6b690e4d01d9d206a812f278832d622a7bdfb74ddcfb5904e19f721fae6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f16639187300b01b7c9f30dda26da7c1de9de9c66baf7ad716875eba41a5d7d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f16639187300b01b7c9f30dda26da7c1de9de9c66baf7ad716875eba41a5d7d6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dab4e59081dcacddedf345841dce8c940fae52aeee55d6c956e1364dd70872b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\
\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dab4e59081dcacddedf345841dce8c940fae52aeee55d6c956e1364dd70872b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://abcfe5aa83c73ae24b7f0b414b47344969924ca50a5e1669bfb4704f1386cd17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abcfe5aa83c73ae24b7f0b414b47344969924ca50a5e1669bfb4704f1386cd17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:32Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:32 crc kubenswrapper[4677]: I1007 13:07:32.978043 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ad65790-6a90-4c21-b5c5-ac1ddf2cbe52\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34b15b8f71a10920a74f784c3440031e14726f661e10f628b269da08e70a7cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a51c6d4e82136b444754dc679f864558f74624af4ff94f794e473c92c8f6c87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96d7ddc445a61d5fd4959d1b3b4e2c93503111a12f461d945dd298a3f8540f65\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f3139524d94a487c756b4a7e3660c0a4380bcba8aa8b588368530f43aab0b1d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f3139524d94a487c756b4a7e3660c0a4380bcba8aa8b588368530f43aab0b1d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T13:07:25Z\\\",\\\"message\\\":\\\"le observer\\\\nW1007 13:07:25.065953 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1007 13:07:25.066115 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 13:07:25.067558 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2107846005/tls.crt::/tmp/serving-cert-2107846005/tls.key\\\\\\\"\\\\nI1007 13:07:25.378394 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1007 13:07:25.383525 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1007 13:07:25.383565 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1007 13:07:25.383605 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1007 13:07:25.383617 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1007 13:07:25.390977 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1007 13:07:25.391014 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1007 13:07:25.391020 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 13:07:25.391038 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 13:07:25.391053 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1007 13:07:25.391061 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1007 13:07:25.391071 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1007 13:07:25.391088 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1007 13:07:25.394664 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b92962cb41f37a615f473651c01e37f5d53e01f3fb4b7c0eb2092095bb55239\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d927fbc5242eb951e8c2ec6d2859315f34e4b190bb43e646044111d1f583bf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d927fbc5242eb951e8c2ec6d2859315f34e4b190bb43e646044111d1f583bf0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:32Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:33 crc kubenswrapper[4677]: I1007 13:07:33.019711 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 13:07:33 crc kubenswrapper[4677]: I1007 13:07:33.019759 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 13:07:33 crc kubenswrapper[4677]: E1007 13:07:33.019878 4677 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: 
object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 07 13:07:33 crc kubenswrapper[4677]: E1007 13:07:33.019894 4677 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 07 13:07:33 crc kubenswrapper[4677]: E1007 13:07:33.019903 4677 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 13:07:33 crc kubenswrapper[4677]: E1007 13:07:33.019948 4677 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-07 13:07:37.019935279 +0000 UTC m=+28.505644394 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 13:07:33 crc kubenswrapper[4677]: E1007 13:07:33.019989 4677 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 07 13:07:33 crc kubenswrapper[4677]: E1007 13:07:33.020032 4677 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 07 13:07:33 crc kubenswrapper[4677]: E1007 13:07:33.020048 4677 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 13:07:33 crc kubenswrapper[4677]: E1007 13:07:33.020123 4677 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-07 13:07:37.020098624 +0000 UTC m=+28.505807819 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 13:07:33 crc kubenswrapper[4677]: I1007 13:07:33.302253 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:07:33 crc kubenswrapper[4677]: I1007 13:07:33.302331 4677 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 13:07:33 crc kubenswrapper[4677]: E1007 13:07:33.302412 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 13:07:33 crc kubenswrapper[4677]: E1007 13:07:33.302524 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 13:07:33 crc kubenswrapper[4677]: I1007 13:07:33.514607 4677 generic.go:334] "Generic (PLEG): container finished" podID="67f7f734-b59a-447c-b4a5-5aeb78d3a4dc" containerID="5b53d862bdcbdf9ebb58ca6717073ecf18236be981bf633bf22e3a077fa6ed90" exitCode=0 Oct 07 13:07:33 crc kubenswrapper[4677]: I1007 13:07:33.514684 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-czmsr" event={"ID":"67f7f734-b59a-447c-b4a5-5aeb78d3a4dc","Type":"ContainerDied","Data":"5b53d862bdcbdf9ebb58ca6717073ecf18236be981bf633bf22e3a077fa6ed90"} Oct 07 13:07:33 crc kubenswrapper[4677]: I1007 13:07:33.515693 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-8xd94" event={"ID":"f78c6e9a-e5e3-4296-b2e6-3ba36d1808ff","Type":"ContainerStarted","Data":"a2e7ebbc9f01ac7f853075c65c8cc57c691cf3f95e41036294486ad4a3bb807c"} Oct 07 13:07:33 crc kubenswrapper[4677]: I1007 13:07:33.526925 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:33Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:33 crc kubenswrapper[4677]: I1007 13:07:33.537711 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e49f4b2f98de9e297e6a31a5583120192adf9a013700b49bb419e54d9e75fdbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:33Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:33 crc kubenswrapper[4677]: I1007 13:07:33.549285 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-r7cnz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7879fa59-a7cb-4d29-ba3a-c91f43bfcba6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5b2cfaaf4533573a7cdf927cb9a0b61690f4f04ca22f5da5013fd218ee2cba1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r59hd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d3e4ef8267212ad1faf24bfcb3b6f633a283684ba587e304e94d434bd9a2618\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r59hd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-r7cnz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:33Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:33 crc kubenswrapper[4677]: I1007 13:07:33.564953 4677 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29c8j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3458826a-000d-407d-92c8-236d1a05842e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",
\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47e
f0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa229d0805576aa04db8e58e52e8c8bdc07663414dde42a94cac4fe628cfac7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17
b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa229d0805576aa04db8e58e52e8c8bdc07663414dde42a94cac4fe628cfac7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-29c8j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:33Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:33 crc kubenswrapper[4677]: I1007 13:07:33.575986 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pjgpx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"73bebfb3-50b5-48b6-b348-1d1feb6202d2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b0ac92c71edc3d5107aece2d0e005a546cf25d79d696f4e330b7c0c8babc546\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":
\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h59cg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pjgpx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:33Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:33 crc kubenswrapper[4677]: I1007 13:07:33.588780 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c40c47d2-50a4-43f1-9b6e-08b60a3260c5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f36e52a7e88b59d8fd38c1fe659ce9b539e514c9d31e326a3ed647ebb8d19781\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://84b1c015461fecca9e5122abe950f33e24f4b7188568ea84cb059a08a4637963\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ecf81a2a9f147c0d9643f8e6c45248164053203ca4e5bbdc57c38e5803a5386\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e8dc3f8bdc52104efdb49a017d6497e2aaa3ed2b593794413fcd1acf2e06d36\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:33Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:33 crc kubenswrapper[4677]: I1007 13:07:33.598876 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:33Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:33 crc kubenswrapper[4677]: I1007 13:07:33.607661 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c2h2k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6a7b491-6ed9-4906-8d2d-d8913a581b95\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://079f44a0676fd6e659268707658dfce76f5c80881ebd1b7f77b831a653002cbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gh4gv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c2h2k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-10-07T13:07:33Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:33 crc kubenswrapper[4677]: I1007 13:07:33.619578 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-czmsr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67f7f734-b59a-447c-b4a5-5aeb78d3a4dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84e2aad54ae491e74845fbb681491bde96694b5f242d15a1b4c0f62b81048e7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84e2aad54ae491e74845fbb681491bde96694b5f242d15a1b4c0f62b81048e7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\
\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2dfd4b5d1804d9d8e9ccf4dea8fb23cf60cd039e32d4bfc7d0b0e189da178b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2dfd4b5d1804d9d8e9ccf4dea8fb23cf60cd039e32d4bfc7d0b0e189da178b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c337ed2525a5d0aca41ff193138829ecd98172765110f97081d5a9b57ea39519\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c337ed2525a5d0aca41ff193138829ecd98172765110f97081d5a9b57ea39519\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b53d862bdcbdf9ebb58ca6717073ecf18236be981bf633bf22e3a077fa6ed90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b53d862bdcbdf9ebb58ca6717073ecf18236be981bf633bf22e3a077fa6ed90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\
"2025-10-07T13:07:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-czmsr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:33Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:33 crc kubenswrapper[4677]: I1007 13:07:33.630913 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf9347ca53bc58ad2e19bdbccd5eb40fde5ef36cdc0c2a2899e7e86977208446\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7932ee6d24ab75f34dabb17b5c2732dc1437e94b4fab6cace5c5bf4d8b4a8fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:33Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:33 crc kubenswrapper[4677]: I1007 13:07:33.644073 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ef29067dc23263a94c4f861ada9ebbe04aae442de3da9fa34db521177f60ce8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:33Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:33 crc kubenswrapper[4677]: I1007 13:07:33.654554 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:33Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:33 crc kubenswrapper[4677]: I1007 13:07:33.667852 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8xd94" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f78c6e9a-e5e3-4296-b2e6-3ba36d1808ff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:31Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:31Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8xd94\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:33Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:33 crc kubenswrapper[4677]: I1007 13:07:33.688328 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9c35782-52f8-4fbc-9e52-07ee92002e3d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9182677b05c8d32f333b4e806b6dc29e0ce3f6171616ed303459ccb6a3754a4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://550a4491cebcbd8b3a62831cce07b13bb79051cd51505aab1f74bcfee692f7b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d90d7cc786a9f94c269a99be97c00685a2e10bde12e0afe4db2de40b95749a47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"la
stState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e99541a9f53339e760fb1074be18ebfcb8b225c64c290478559d2e3722ba9296\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73cec6b690e4d01d9d206a812f278832d622a7bdfb74ddcfb5904e19f721fae6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f16639187300b01b7c9f30dda26da7c1de9de9c66baf7ad716875eba41a5d7d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f16639187300b01b7c9f30dda26da7c1de9de9c66baf7ad716875eba41a5d7d6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dab4e59081dcacddedf345841dce8c940fae52aeee55d6c956e1364dd70872b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",
\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dab4e59081dcacddedf345841dce8c940fae52aeee55d6c956e1364dd70872b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://abcfe5aa83c73ae24b7f0b414b47344969924ca50a5e1669bfb4704f1386cd17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abcfe5aa83c73ae24b7f0b414b47344969924ca50a5e1669bfb4704f1386cd17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:33Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:33 crc kubenswrapper[4677]: I1007 13:07:33.702529 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ad65790-6a90-4c21-b5c5-ac1ddf2cbe52\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34b15b8f71a10920a74f784c3440031e14726f661e10f628b269da08e70a7cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a51c6d4e82136b444754dc679f864558f74624af4ff94f794e473c92c8f6c87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96d7ddc445a61d5fd4959d1b3b4e2c93503111a12f461d945dd298a3f8540f65\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f3139524d94a487c756b4a7e3660c0a4380bcba8aa8b588368530f43aab0b1d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f3139524d94a487c756b4a7e3660c0a4380bcba8aa8b588368530f43aab0b1d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T13:07:25Z\\\",\\\"message\\\":\\\"le observer\\\\nW1007 13:07:25.065953 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1007 13:07:25.066115 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 13:07:25.067558 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2107846005/tls.crt::/tmp/serving-cert-2107846005/tls.key\\\\\\\"\\\\nI1007 13:07:25.378394 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1007 13:07:25.383525 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1007 13:07:25.383565 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1007 13:07:25.383605 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1007 13:07:25.383617 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1007 13:07:25.390977 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1007 13:07:25.391014 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1007 13:07:25.391020 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 13:07:25.391038 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 13:07:25.391053 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1007 13:07:25.391061 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1007 13:07:25.391071 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1007 13:07:25.391088 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1007 13:07:25.394664 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b92962cb41f37a615f473651c01e37f5d53e01f3fb4b7c0eb2092095bb55239\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d927fbc5242eb951e8c2ec6d2859315f34e4b190bb43e646044111d1f583bf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d927fbc5242eb951e8c2ec6d2859315f34e4b190bb43e646044111d1f583bf0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:33Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:33 crc kubenswrapper[4677]: I1007 13:07:33.715422 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-r7cnz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7879fa59-a7cb-4d29-ba3a-c91f43bfcba6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5b2cfaaf4533573a7cdf927cb9a0b61690f4f04ca22f5da5013fd218ee2cba1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r59hd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d3e4ef8267212ad1faf24bfcb3b6f633a283684ba587e304e94d434bd9a2618\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r59hd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-r7cnz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:33Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:33 crc kubenswrapper[4677]: I1007 13:07:33.733253 4677 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29c8j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3458826a-000d-407d-92c8-236d1a05842e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",
\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47e
f0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa229d0805576aa04db8e58e52e8c8bdc07663414dde42a94cac4fe628cfac7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17
b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa229d0805576aa04db8e58e52e8c8bdc07663414dde42a94cac4fe628cfac7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-29c8j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:33Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:33 crc kubenswrapper[4677]: I1007 13:07:33.754135 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pjgpx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"73bebfb3-50b5-48b6-b348-1d1feb6202d2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b0ac92c71edc3d5107aece2d0e005a546cf25d79d696f4e330b7c0c8babc546\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":
\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h59cg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pjgpx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:33Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:33 crc kubenswrapper[4677]: I1007 13:07:33.766045 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c40c47d2-50a4-43f1-9b6e-08b60a3260c5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f36e52a7e88b59d8fd38c1fe659ce9b539e514c9d31e326a3ed647ebb8d19781\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://84b1c015461fecca9e5122abe950f33e24f4b7188568ea84cb059a08a4637963\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ecf81a2a9f147c0d9643f8e6c45248164053203ca4e5bbdc57c38e5803a5386\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e8dc3f8bdc52104efdb49a017d6497e2aaa3ed2b593794413fcd1acf2e06d36\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:33Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:33 crc kubenswrapper[4677]: I1007 13:07:33.781619 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:33Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:33 crc kubenswrapper[4677]: I1007 13:07:33.797795 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c2h2k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6a7b491-6ed9-4906-8d2d-d8913a581b95\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://079f44a0676fd6e659268707658dfce76f5c80881ebd1b7f77b831a653002cbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gh4gv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c2h2k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-10-07T13:07:33Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:33 crc kubenswrapper[4677]: I1007 13:07:33.815528 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-czmsr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67f7f734-b59a-447c-b4a5-5aeb78d3a4dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84e2aad54ae491e74845fbb681491bde96694b5f242d15a1b4c0f62b81048e7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84e2aad54ae491e74845fbb681491bde96694b5f242d15a1b4c0f62b81048e7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\
\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2dfd4b5d1804d9d8e9ccf4dea8fb23cf60cd039e32d4bfc7d0b0e189da178b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2dfd4b5d1804d9d8e9ccf4dea8fb23cf60cd039e32d4bfc7d0b0e189da178b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c337ed2525a5d0aca41ff193138829ecd98172765110f97081d5a9b57ea39519\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c337ed2525a5d0aca41ff193138829ecd98172765110f97081d5a9b57ea39519\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b53d862bdcbdf9ebb58ca6717073ecf18236be981bf633bf22e3a077fa6ed90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b53d862bdcbdf9ebb58ca6717073ecf18236be981bf633bf22e3a077fa6ed90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\
"2025-10-07T13:07:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-czmsr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:33Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:33 crc kubenswrapper[4677]: I1007 13:07:33.831633 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf9347ca53bc58ad2e19bdbccd5eb40fde5ef36cdc0c2a2899e7e86977208446\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7932ee6d24ab75f34dabb17b5c2732dc1437e94b4fab6cace5c5bf4d8b4a8fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:33Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:33 crc kubenswrapper[4677]: I1007 13:07:33.846083 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ef29067dc23263a94c4f861ada9ebbe04aae442de3da9fa34db521177f60ce8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:33Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:33 crc kubenswrapper[4677]: I1007 13:07:33.858634 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:33Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:33 crc kubenswrapper[4677]: I1007 13:07:33.870172 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8xd94" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f78c6e9a-e5e3-4296-b2e6-3ba36d1808ff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2e7ebbc9f01ac7f853075c65c8cc57c691cf3f95e41036294486ad4a3bb807c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8xd94\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:33Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:33 crc kubenswrapper[4677]: I1007 13:07:33.891034 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9c35782-52f8-4fbc-9e52-07ee92002e3d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9182677b05c8d32f333b4e806b6dc29e0ce3f6171616ed303459ccb6a3754a4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://550a4491cebcbd8b3a62831cce07b13bb79051cd51505aab1f74bcfee692f7b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d90d7cc786a9f94c269a99be97c00685a2e10bde12e0afe4db2de40b95749a47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\
",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e99541a9f53339e760fb1074be18ebfcb8b225c64c290478559d2e3722ba9296\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73cec6b690e4d01d9d206a812f278832d622a7bdfb74ddcfb5904e19f721fae6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f16639187300b01b7c9f30dda26da7c1de9de9c66baf7ad716875eba41a5d7d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f16639187300b01b7c9f30dda26da7c1de9de9c66baf7ad716875eba41a5d7d6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dab4e59081dcacddedf345841dce8c940fae52aeee55d6c956e1364dd70872b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-v
ars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dab4e59081dcacddedf345841dce8c940fae52aeee55d6c956e1364dd70872b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://abcfe5aa83c73ae24b7f0b414b47344969924ca50a5e1669bfb4704f1386cd17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abcfe5aa83c73ae24b7f0b414b47344969924ca50a5e1669bfb4704f1386cd17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:33Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:33 crc kubenswrapper[4677]: I1007 13:07:33.905892 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ad65790-6a90-4c21-b5c5-ac1ddf2cbe52\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34b15b8f71a10920a74f784c3440031e14726f661e10f628b269da08e70a7cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a51c6d4e82136b444754dc679f864558f74624af4ff94f794e473c92c8f6c87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96d7ddc445a61d5fd4959d1b3b4e2c93503111a12f461d945dd298a3f8540f65\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f3139524d94a487c756b4a7e3660c0a4380bcba8aa8b588368530f43aab0b1d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f3139524d94a487c756b4a7e3660c0a4380bcba8aa8b588368530f43aab0b1d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T13:07:25Z\\\",\\\"message\\\":\\\"le observer\\\\nW1007 13:07:25.065953 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1007 13:07:25.066115 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 13:07:25.067558 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2107846005/tls.crt::/tmp/serving-cert-2107846005/tls.key\\\\\\\"\\\\nI1007 13:07:25.378394 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1007 13:07:25.383525 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1007 13:07:25.383565 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1007 13:07:25.383605 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1007 13:07:25.383617 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1007 13:07:25.390977 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1007 13:07:25.391014 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1007 13:07:25.391020 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 13:07:25.391038 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 13:07:25.391053 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1007 13:07:25.391061 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1007 13:07:25.391071 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1007 13:07:25.391088 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1007 13:07:25.394664 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b92962cb41f37a615f473651c01e37f5d53e01f3fb4b7c0eb2092095bb55239\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d927fbc5242eb951e8c2ec6d2859315f34e4b190bb43e646044111d1f583bf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d927fbc5242eb951e8c2ec6d2859315f34e4b190bb43e646044111d1f583bf0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:33Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:33 crc kubenswrapper[4677]: I1007 13:07:33.920741 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:33Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:33 crc kubenswrapper[4677]: I1007 13:07:33.932721 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e49f4b2f98de9e297e6a31a5583120192adf9a013700b49bb419e54d9e75fdbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:33Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:34 crc kubenswrapper[4677]: I1007 13:07:34.302224 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 13:07:34 crc kubenswrapper[4677]: E1007 13:07:34.302355 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 13:07:34 crc kubenswrapper[4677]: I1007 13:07:34.526382 4677 generic.go:334] "Generic (PLEG): container finished" podID="67f7f734-b59a-447c-b4a5-5aeb78d3a4dc" containerID="3de85e8780fadb509355343e297ea73c2bd1047219be9911d445e044d40d378b" exitCode=0 Oct 07 13:07:34 crc kubenswrapper[4677]: I1007 13:07:34.526547 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-czmsr" event={"ID":"67f7f734-b59a-447c-b4a5-5aeb78d3a4dc","Type":"ContainerDied","Data":"3de85e8780fadb509355343e297ea73c2bd1047219be9911d445e044d40d378b"} Oct 07 13:07:34 crc kubenswrapper[4677]: I1007 13:07:34.536938 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29c8j" event={"ID":"3458826a-000d-407d-92c8-236d1a05842e","Type":"ContainerStarted","Data":"b1f410624ff7e026c196d43d5ef830ce7b34981b703d5399a135dab0122640ca"} Oct 07 13:07:34 crc kubenswrapper[4677]: I1007 13:07:34.544300 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:34Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:34 crc kubenswrapper[4677]: I1007 13:07:34.564751 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e49f4b2f98de9e297e6a31a5583120192adf9a013700b49bb419e54d9e75fdbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:34Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:34 crc kubenswrapper[4677]: I1007 13:07:34.581862 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-r7cnz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7879fa59-a7cb-4d29-ba3a-c91f43bfcba6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5b2cfaaf4533573a7cdf927cb9a0b61690f4f04ca22f5da5013fd218ee2cba1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r59hd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d3e4ef8267212ad1faf24bfcb3b6f633a283684ba587e304e94d434bd9a2618\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r59hd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-r7cnz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:34Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:34 crc kubenswrapper[4677]: I1007 13:07:34.607034 4677 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29c8j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3458826a-000d-407d-92c8-236d1a05842e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",
\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47e
f0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa229d0805576aa04db8e58e52e8c8bdc07663414dde42a94cac4fe628cfac7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17
b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa229d0805576aa04db8e58e52e8c8bdc07663414dde42a94cac4fe628cfac7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-29c8j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:34Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:34 crc kubenswrapper[4677]: I1007 13:07:34.622211 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pjgpx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"73bebfb3-50b5-48b6-b348-1d1feb6202d2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b0ac92c71edc3d5107aece2d0e005a546cf25d79d696f4e330b7c0c8babc546\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":
\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h59cg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pjgpx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:34Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:34 crc kubenswrapper[4677]: I1007 13:07:34.636590 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c40c47d2-50a4-43f1-9b6e-08b60a3260c5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f36e52a7e88b59d8fd38c1fe659ce9b539e514c9d31e326a3ed647ebb8d19781\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://84b1c015461fecca9e5122abe950f33e24f4b7188568ea84cb059a08a4637963\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ecf81a2a9f147c0d9643f8e6c45248164053203ca4e5bbdc57c38e5803a5386\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e8dc3f8bdc52104efdb49a017d6497e2aaa3ed2b593794413fcd1acf2e06d36\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:34Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:34 crc kubenswrapper[4677]: I1007 13:07:34.649944 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:34Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:34 crc kubenswrapper[4677]: I1007 13:07:34.665023 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c2h2k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6a7b491-6ed9-4906-8d2d-d8913a581b95\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://079f44a0676fd6e659268707658dfce76f5c80881ebd1b7f77b831a653002cbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gh4gv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c2h2k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-10-07T13:07:34Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:34 crc kubenswrapper[4677]: I1007 13:07:34.689762 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-czmsr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67f7f734-b59a-447c-b4a5-5aeb78d3a4dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84e2aad54ae491e74845fbb681491bde96694b5f242d15a1b4c0f62b81048e7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84e2aad54ae491e74845fbb681491bde96694b5f242d15a1b4c0f62b81048e7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2dfd4b5d1804d9d8e9ccf4dea8fb23cf60cd039e32d4bfc7d0b0e189da178b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2dfd4b5d1804d9d8e9ccf4dea8fb23cf60cd039e32d4bfc7d0b0e189da178b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c337ed2525a5d0aca41ff193138829ecd98172765110f97081d5a9b57ea39519\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c337ed2525a5d0aca41ff193138829ecd98172765110f97081d5a9b57ea39519\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b53d862bdcbdf9ebb58ca6717073ecf18236be981bf633bf22e3a077fa6ed90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b53d862bdcbdf9ebb58ca6717073ecf18236be981bf633bf22e3a077fa6ed90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:32Z\\\
",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3de85e8780fadb509355343e297ea73c2bd1047219be9911d445e044d40d378b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3de85e8780fadb509355343e297ea73c2bd1047219be9911d445e044d40d378b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-czmsr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:34Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:34 crc kubenswrapper[4677]: I1007 13:07:34.711261 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf9347ca53bc58ad2e19bdbccd5eb40fde5ef36cdc0c2a2899e7e86977208446\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7932ee6d24ab75f34dabb17b5c2732dc1437e94b4fab6cace5c5bf4d8b4a8fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:34Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:34 crc kubenswrapper[4677]: I1007 13:07:34.725875 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ef29067dc23263a94c4f861ada9ebbe04aae442de3da9fa34db521177f60ce8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:34Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:34 crc kubenswrapper[4677]: I1007 13:07:34.740035 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:34Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:34 crc kubenswrapper[4677]: I1007 13:07:34.755359 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8xd94" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f78c6e9a-e5e3-4296-b2e6-3ba36d1808ff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2e7ebbc9f01ac7f853075c65c8cc57c691cf3f95e41036294486ad4a3bb807c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8xd94\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:34Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:34 crc kubenswrapper[4677]: I1007 13:07:34.775302 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9c35782-52f8-4fbc-9e52-07ee92002e3d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9182677b05c8d32f333b4e806b6dc29e0ce3f6171616ed303459ccb6a3754a4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://550a4491cebcbd8b3a62831cce07b13bb79051cd51505aab1f74bcfee692f7b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d90d7cc786a9f94c269a99be97c00685a2e10bde12e0afe4db2de40b95749a47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\
",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e99541a9f53339e760fb1074be18ebfcb8b225c64c290478559d2e3722ba9296\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73cec6b690e4d01d9d206a812f278832d622a7bdfb74ddcfb5904e19f721fae6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f16639187300b01b7c9f30dda26da7c1de9de9c66baf7ad716875eba41a5d7d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f16639187300b01b7c9f30dda26da7c1de9de9c66baf7ad716875eba41a5d7d6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dab4e59081dcacddedf345841dce8c940fae52aeee55d6c956e1364dd70872b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-v
ars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dab4e59081dcacddedf345841dce8c940fae52aeee55d6c956e1364dd70872b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://abcfe5aa83c73ae24b7f0b414b47344969924ca50a5e1669bfb4704f1386cd17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abcfe5aa83c73ae24b7f0b414b47344969924ca50a5e1669bfb4704f1386cd17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:34Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:34 crc kubenswrapper[4677]: I1007 13:07:34.791360 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ad65790-6a90-4c21-b5c5-ac1ddf2cbe52\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34b15b8f71a10920a74f784c3440031e14726f661e10f628b269da08e70a7cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a51c6d4e82136b444754dc679f864558f74624af4ff94f794e473c92c8f6c87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96d7ddc445a61d5fd4959d1b3b4e2c93503111a12f461d945dd298a3f8540f65\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f3139524d94a487c756b4a7e3660c0a4380bcba8aa8b588368530f43aab0b1d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f3139524d94a487c756b4a7e3660c0a4380bcba8aa8b588368530f43aab0b1d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T13:07:25Z\\\",\\\"message\\\":\\\"le observer\\\\nW1007 13:07:25.065953 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1007 13:07:25.066115 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 13:07:25.067558 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2107846005/tls.crt::/tmp/serving-cert-2107846005/tls.key\\\\\\\"\\\\nI1007 13:07:25.378394 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1007 13:07:25.383525 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1007 13:07:25.383565 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1007 13:07:25.383605 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1007 13:07:25.383617 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1007 13:07:25.390977 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1007 13:07:25.391014 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1007 13:07:25.391020 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 13:07:25.391038 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 13:07:25.391053 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1007 13:07:25.391061 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1007 13:07:25.391071 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1007 13:07:25.391088 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1007 13:07:25.394664 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b92962cb41f37a615f473651c01e37f5d53e01f3fb4b7c0eb2092095bb55239\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d927fbc5242eb951e8c2ec6d2859315f34e4b190bb43e646044111d1f583bf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d927fbc5242eb951e8c2ec6d2859315f34e4b190bb43e646044111d1f583bf0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:34Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:35 crc kubenswrapper[4677]: I1007 13:07:35.068634 4677 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 13:07:35 crc kubenswrapper[4677]: I1007 13:07:35.071741 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:35 crc kubenswrapper[4677]: I1007 13:07:35.071820 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:35 crc kubenswrapper[4677]: I1007 13:07:35.071849 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:35 crc kubenswrapper[4677]: I1007 13:07:35.072022 4677 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 07 13:07:35 crc kubenswrapper[4677]: I1007 13:07:35.082483 4677 kubelet_node_status.go:115] "Node was previously registered" node="crc" Oct 07 13:07:35 crc kubenswrapper[4677]: I1007 
13:07:35.082774 4677 kubelet_node_status.go:79] "Successfully registered node" node="crc" Oct 07 13:07:35 crc kubenswrapper[4677]: I1007 13:07:35.085232 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:35 crc kubenswrapper[4677]: I1007 13:07:35.085283 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:35 crc kubenswrapper[4677]: I1007 13:07:35.085299 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:35 crc kubenswrapper[4677]: I1007 13:07:35.085319 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:35 crc kubenswrapper[4677]: I1007 13:07:35.085341 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:35Z","lastTransitionTime":"2025-10-07T13:07:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:07:35 crc kubenswrapper[4677]: E1007 13:07:35.107859 4677 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:07:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:07:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:07:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:07:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2461c0fe-8a8b-483d-90f2-2a3d8d7aca47\\\",\\\"systemUUID\\\":\\\"68c6c527-b248-4c1e-9fd2-b44685e78bcf\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:35Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:35 crc kubenswrapper[4677]: I1007 13:07:35.112682 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:35 crc kubenswrapper[4677]: I1007 13:07:35.112737 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 07 13:07:35 crc kubenswrapper[4677]: I1007 13:07:35.112752 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:35 crc kubenswrapper[4677]: I1007 13:07:35.112777 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:35 crc kubenswrapper[4677]: I1007 13:07:35.112792 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:35Z","lastTransitionTime":"2025-10-07T13:07:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:07:35 crc kubenswrapper[4677]: E1007 13:07:35.131136 4677 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:07:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:07:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:07:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:07:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2461c0fe-8a8b-483d-90f2-2a3d8d7aca47\\\",\\\"systemUUID\\\":\\\"68c6c527-b248-4c1e-9fd2-b44685e78bcf\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:35Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:35 crc kubenswrapper[4677]: I1007 13:07:35.135702 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:35 crc kubenswrapper[4677]: I1007 13:07:35.135755 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 07 13:07:35 crc kubenswrapper[4677]: I1007 13:07:35.135767 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:35 crc kubenswrapper[4677]: I1007 13:07:35.135786 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:35 crc kubenswrapper[4677]: I1007 13:07:35.135799 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:35Z","lastTransitionTime":"2025-10-07T13:07:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:07:35 crc kubenswrapper[4677]: E1007 13:07:35.155472 4677 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:07:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:07:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:07:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:07:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2461c0fe-8a8b-483d-90f2-2a3d8d7aca47\\\",\\\"systemUUID\\\":\\\"68c6c527-b248-4c1e-9fd2-b44685e78bcf\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:35Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:35 crc kubenswrapper[4677]: I1007 13:07:35.159850 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:35 crc kubenswrapper[4677]: I1007 13:07:35.159914 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 07 13:07:35 crc kubenswrapper[4677]: I1007 13:07:35.159930 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:35 crc kubenswrapper[4677]: I1007 13:07:35.159953 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:35 crc kubenswrapper[4677]: I1007 13:07:35.159970 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:35Z","lastTransitionTime":"2025-10-07T13:07:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:07:35 crc kubenswrapper[4677]: E1007 13:07:35.178645 4677 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:07:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:07:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:07:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:07:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2461c0fe-8a8b-483d-90f2-2a3d8d7aca47\\\",\\\"systemUUID\\\":\\\"68c6c527-b248-4c1e-9fd2-b44685e78bcf\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:35Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:35 crc kubenswrapper[4677]: I1007 13:07:35.182393 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:35 crc kubenswrapper[4677]: I1007 13:07:35.182484 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 07 13:07:35 crc kubenswrapper[4677]: I1007 13:07:35.182506 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:35 crc kubenswrapper[4677]: I1007 13:07:35.182533 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:35 crc kubenswrapper[4677]: I1007 13:07:35.182546 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:35Z","lastTransitionTime":"2025-10-07T13:07:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:07:35 crc kubenswrapper[4677]: E1007 13:07:35.204613 4677 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:07:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:07:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:07:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:07:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2461c0fe-8a8b-483d-90f2-2a3d8d7aca47\\\",\\\"systemUUID\\\":\\\"68c6c527-b248-4c1e-9fd2-b44685e78bcf\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:35Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:35 crc kubenswrapper[4677]: E1007 13:07:35.204781 4677 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 07 13:07:35 crc kubenswrapper[4677]: I1007 13:07:35.206909 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Oct 07 13:07:35 crc kubenswrapper[4677]: I1007 13:07:35.206942 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:35 crc kubenswrapper[4677]: I1007 13:07:35.206955 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:35 crc kubenswrapper[4677]: I1007 13:07:35.206974 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:35 crc kubenswrapper[4677]: I1007 13:07:35.206990 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:35Z","lastTransitionTime":"2025-10-07T13:07:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:07:35 crc kubenswrapper[4677]: I1007 13:07:35.302620 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 13:07:35 crc kubenswrapper[4677]: I1007 13:07:35.302674 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:07:35 crc kubenswrapper[4677]: E1007 13:07:35.302860 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 13:07:35 crc kubenswrapper[4677]: E1007 13:07:35.303063 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 13:07:35 crc kubenswrapper[4677]: I1007 13:07:35.309584 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:35 crc kubenswrapper[4677]: I1007 13:07:35.309659 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:35 crc kubenswrapper[4677]: I1007 13:07:35.309683 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:35 crc kubenswrapper[4677]: I1007 13:07:35.309713 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:35 crc kubenswrapper[4677]: I1007 13:07:35.309736 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:35Z","lastTransitionTime":"2025-10-07T13:07:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:07:35 crc kubenswrapper[4677]: I1007 13:07:35.412261 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:35 crc kubenswrapper[4677]: I1007 13:07:35.412323 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:35 crc kubenswrapper[4677]: I1007 13:07:35.412341 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:35 crc kubenswrapper[4677]: I1007 13:07:35.412367 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:35 crc kubenswrapper[4677]: I1007 13:07:35.412385 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:35Z","lastTransitionTime":"2025-10-07T13:07:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:07:35 crc kubenswrapper[4677]: I1007 13:07:35.515380 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:35 crc kubenswrapper[4677]: I1007 13:07:35.515482 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:35 crc kubenswrapper[4677]: I1007 13:07:35.515498 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:35 crc kubenswrapper[4677]: I1007 13:07:35.515516 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:35 crc kubenswrapper[4677]: I1007 13:07:35.515531 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:35Z","lastTransitionTime":"2025-10-07T13:07:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:07:35 crc kubenswrapper[4677]: I1007 13:07:35.543625 4677 generic.go:334] "Generic (PLEG): container finished" podID="67f7f734-b59a-447c-b4a5-5aeb78d3a4dc" containerID="6309cf358d3d85018e9e89fdb974146ab00944de09d3b38cfbf357df59b442f3" exitCode=0 Oct 07 13:07:35 crc kubenswrapper[4677]: I1007 13:07:35.543667 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-czmsr" event={"ID":"67f7f734-b59a-447c-b4a5-5aeb78d3a4dc","Type":"ContainerDied","Data":"6309cf358d3d85018e9e89fdb974146ab00944de09d3b38cfbf357df59b442f3"} Oct 07 13:07:35 crc kubenswrapper[4677]: I1007 13:07:35.562138 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c40c47d2-50a4-43f1-9b6e-08b60a3260c5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f36e52a7e88b59d8fd38c1fe659ce9b539e514c9d31e326a3ed647ebb8d19781\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://84b1c015461fecca9e5122abe950f33e24f4b7188568ea84cb059a08a4637963\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ecf81a2a9f147c0d9643f8e6c45248164053203ca4e5bbdc57c38e5803a5386\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e8dc3f8bdc52104efdb49a017d6497e2aaa3ed2b593794413fcd1acf2e06d36\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:35Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:35 crc kubenswrapper[4677]: I1007 13:07:35.575934 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:35Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:35 crc kubenswrapper[4677]: I1007 13:07:35.592658 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-r7cnz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7879fa59-a7cb-4d29-ba3a-c91f43bfcba6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5b2cfaaf4533573a7cdf927cb9a0b61690f4f04ca22f5da5013fd218ee2cba1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r59hd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d3e4ef8267212ad1faf24bfcb3b6f633a283684ba587e304e94d434bd9a2618\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r59hd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-r7cnz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:35Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:35 crc kubenswrapper[4677]: I1007 13:07:35.618220 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:35 crc kubenswrapper[4677]: I1007 13:07:35.618266 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:35 crc kubenswrapper[4677]: I1007 13:07:35.618284 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:35 crc kubenswrapper[4677]: I1007 13:07:35.618305 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:35 crc kubenswrapper[4677]: I1007 13:07:35.618319 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:35Z","lastTransitionTime":"2025-10-07T13:07:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:07:35 crc kubenswrapper[4677]: I1007 13:07:35.622149 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29c8j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3458826a-000d-407d-92c8-236d1a05842e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fa
lse,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\
\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa229d0805576aa04db8e58e52e8c8bdc07663414dde42a94c
ac4fe628cfac7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa229d0805576aa04db8e58e52e8c8bdc07663414dde42a94cac4fe628cfac7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-29c8j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:35Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:35 crc kubenswrapper[4677]: I1007 13:07:35.641351 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pjgpx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"73bebfb3-50b5-48b6-b348-1d1feb6202d2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b0ac92c71edc3d5107aece2d0e005a546cf25d79d696f4e330b7c0c8babc546\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/n
et.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h59cg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pjgpx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:35Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:35 crc kubenswrapper[4677]: I1007 13:07:35.659328 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf9347ca53bc58ad2e19bdbccd5eb40fde5ef36cdc0c2a2899e7e86977208446\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7932ee6d24ab75f34dabb17b5c2732dc1437e94b4fab6cace5c5bf4d8b4a8fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:35Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:35 crc kubenswrapper[4677]: I1007 13:07:35.679288 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c2h2k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6a7b491-6ed9-4906-8d2d-d8913a581b95\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://079f44a0676fd6e659268707658dfce76f5c80881ebd1b7f77b831a653002cbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gh4gv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c2h2k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:35Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:35 crc kubenswrapper[4677]: I1007 13:07:35.697909 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-czmsr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67f7f734-b59a-447c-b4a5-5aeb78d3a4dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84e2aad54ae491e74845fbb681491bde96694b5f242d15a1b4c0f62b81048e7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84e2aad54ae491e74845fbb681491bde96694b5f242d15a1b4c0f62b81048e7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2dfd4b5d1804d9d8e9ccf4dea8fb23cf60cd039e32d4bfc7d0b0e189da178b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2dfd4b5d1804d9d8e9ccf4dea8fb23cf60cd039e32d4bfc7d0b0e189da178b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c337ed2525a5d0aca41ff193138829ecd98172765110f97081d5a9b57ea39519\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c337ed2525a5d0aca41ff193138829ecd98172765110f97081d5a9b57ea39519\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b53d862bdcbdf9ebb58ca6717073ecf18236be981bf633bf22e3a077fa6ed90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b53d862bdcbdf9ebb58ca6717073ecf18236be981bf633bf22e3a077fa6ed90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3de85e8780fadb509355343e297ea73c2bd1047219be9911d445e044d40d378b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3de
85e8780fadb509355343e297ea73c2bd1047219be9911d445e044d40d378b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6309cf358d3d85018e9e89fdb974146ab00944de09d3b38cfbf357df59b442f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6309cf358d3d85018e9e89fdb974146ab00944de09d3b38cfbf357df59b442f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-czmsr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:35Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:35 crc kubenswrapper[4677]: I1007 13:07:35.721145 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:35 crc kubenswrapper[4677]: I1007 13:07:35.721195 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:35 crc kubenswrapper[4677]: I1007 13:07:35.721211 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:35 crc kubenswrapper[4677]: I1007 13:07:35.721234 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:35 crc kubenswrapper[4677]: I1007 13:07:35.721251 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:35Z","lastTransitionTime":"2025-10-07T13:07:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:07:35 crc kubenswrapper[4677]: I1007 13:07:35.729617 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9c35782-52f8-4fbc-9e52-07ee92002e3d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9182677b05c8d32f333b4e806b6dc29e0ce3f6171616ed303459ccb6a3754a4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://550a4491cebcbd8b3a62831cce07b13bb79051cd51505aab1f74bcfee692f7b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d90d7cc786a9f94c269a99be97c00685a2e10bde12e0afe4db2de40b95749a47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e99541a9f53339e760fb1074be18ebfcb8b225c64c290478559d2e3722ba9296\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73cec6b690e4d01d9d206a812f278832d622a7bdfb74ddcfb5904e19f721fae6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f16639187300b01b7c9f30dda26da7c1de9de9c66baf7ad716875eba41a5d7d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f16639187300b01b7c9f30dda26da7c1de9de9c66baf7ad716875eba41a5d7d6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dab4e59081dcacddedf345841dce8c940fae52aeee55d6c956e1364dd70872b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dab4e59081dcacddedf345841dce8c940fae52aeee55d6c956e1364dd70872b5\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2025-10-07T13:07:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://abcfe5aa83c73ae24b7f0b414b47344969924ca50a5e1669bfb4704f1386cd17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abcfe5aa83c73ae24b7f0b414b47344969924ca50a5e1669bfb4704f1386cd17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:35Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:35 crc kubenswrapper[4677]: I1007 13:07:35.746331 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ad65790-6a90-4c21-b5c5-ac1ddf2cbe52\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34b15b8f71a10920a74f784c3440031e14726f661e10f628b269da08e70a7cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a51c6d4e82136b444754dc679f864558f74624af4ff94f794e473c92c8f6c87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96d7ddc445a61d5fd4959d1b3b4e2c93503111a12f461d945dd298a3f8540f65\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f3139524d94a487c756b4a7e3660c0a4380bcba8aa8b588368530f43aab0b1d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f3139524d94a487c756b4a7e3660c0a4380bcba8aa8b588368530f43aab0b1d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T13:07:25Z\\\",\\\"message\\\":\\\"le observer\\\\nW1007 13:07:25.065953 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1007 13:07:25.066115 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 13:07:25.067558 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2107846005/tls.crt::/tmp/serving-cert-2107846005/tls.key\\\\\\\"\\\\nI1007 13:07:25.378394 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1007 13:07:25.383525 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1007 13:07:25.383565 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1007 13:07:25.383605 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1007 13:07:25.383617 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1007 13:07:25.390977 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1007 13:07:25.391014 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1007 13:07:25.391020 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 13:07:25.391038 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 13:07:25.391053 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1007 13:07:25.391061 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1007 13:07:25.391071 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1007 13:07:25.391088 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1007 13:07:25.394664 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b92962cb41f37a615f473651c01e37f5d53e01f3fb4b7c0eb2092095bb55239\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d927fbc5242eb951e8c2ec6d2859315f34e4b190bb43e646044111d1f583bf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d927fbc5242eb951e8c2ec6d2859315f34e4b190bb43e646044111d1f583bf0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:35Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:35 crc kubenswrapper[4677]: I1007 13:07:35.767199 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ef29067dc23263a94c4f861ada9ebbe04aae442de3da9fa34db521177f60ce8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:35Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:35 crc kubenswrapper[4677]: I1007 13:07:35.781274 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:35Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:35 crc kubenswrapper[4677]: I1007 13:07:35.797822 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8xd94" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f78c6e9a-e5e3-4296-b2e6-3ba36d1808ff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2e7ebbc9f01ac7f853075c65c8cc57c691cf3f95e41036294486ad4a3bb807c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8xd94\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:35Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:35 crc kubenswrapper[4677]: I1007 13:07:35.818775 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:35Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:35 crc kubenswrapper[4677]: I1007 13:07:35.824121 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:35 crc kubenswrapper[4677]: I1007 13:07:35.824169 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:35 crc kubenswrapper[4677]: I1007 13:07:35.824182 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:35 crc kubenswrapper[4677]: I1007 13:07:35.824204 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:35 crc kubenswrapper[4677]: I1007 13:07:35.824219 4677 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:35Z","lastTransitionTime":"2025-10-07T13:07:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:07:35 crc kubenswrapper[4677]: I1007 13:07:35.842034 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e49f4b2f98de9e297e6a31a5583120192adf9a013700b49bb419e54d9e75fdbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:35Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:35 crc kubenswrapper[4677]: I1007 13:07:35.927283 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:35 crc kubenswrapper[4677]: I1007 13:07:35.927321 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:35 crc kubenswrapper[4677]: I1007 13:07:35.927330 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:35 crc kubenswrapper[4677]: I1007 13:07:35.927345 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:35 crc kubenswrapper[4677]: I1007 13:07:35.927356 4677 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:35Z","lastTransitionTime":"2025-10-07T13:07:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:07:36 crc kubenswrapper[4677]: I1007 13:07:36.029770 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:36 crc kubenswrapper[4677]: I1007 13:07:36.029808 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:36 crc kubenswrapper[4677]: I1007 13:07:36.029818 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:36 crc kubenswrapper[4677]: I1007 13:07:36.029832 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:36 crc kubenswrapper[4677]: I1007 13:07:36.029841 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:36Z","lastTransitionTime":"2025-10-07T13:07:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:07:36 crc kubenswrapper[4677]: I1007 13:07:36.132933 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:36 crc kubenswrapper[4677]: I1007 13:07:36.132989 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:36 crc kubenswrapper[4677]: I1007 13:07:36.132999 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:36 crc kubenswrapper[4677]: I1007 13:07:36.133019 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:36 crc kubenswrapper[4677]: I1007 13:07:36.133032 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:36Z","lastTransitionTime":"2025-10-07T13:07:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:07:36 crc kubenswrapper[4677]: I1007 13:07:36.236665 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:36 crc kubenswrapper[4677]: I1007 13:07:36.236830 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:36 crc kubenswrapper[4677]: I1007 13:07:36.236849 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:36 crc kubenswrapper[4677]: I1007 13:07:36.236879 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:36 crc kubenswrapper[4677]: I1007 13:07:36.236897 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:36Z","lastTransitionTime":"2025-10-07T13:07:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:07:36 crc kubenswrapper[4677]: I1007 13:07:36.302862 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 13:07:36 crc kubenswrapper[4677]: E1007 13:07:36.303093 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 13:07:36 crc kubenswrapper[4677]: I1007 13:07:36.339402 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:36 crc kubenswrapper[4677]: I1007 13:07:36.339497 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:36 crc kubenswrapper[4677]: I1007 13:07:36.339513 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:36 crc kubenswrapper[4677]: I1007 13:07:36.339538 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:36 crc kubenswrapper[4677]: I1007 13:07:36.339552 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:36Z","lastTransitionTime":"2025-10-07T13:07:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:07:36 crc kubenswrapper[4677]: I1007 13:07:36.442572 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:36 crc kubenswrapper[4677]: I1007 13:07:36.442648 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:36 crc kubenswrapper[4677]: I1007 13:07:36.442671 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:36 crc kubenswrapper[4677]: I1007 13:07:36.442704 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:36 crc kubenswrapper[4677]: I1007 13:07:36.442725 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:36Z","lastTransitionTime":"2025-10-07T13:07:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:07:36 crc kubenswrapper[4677]: I1007 13:07:36.545039 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:36 crc kubenswrapper[4677]: I1007 13:07:36.545083 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:36 crc kubenswrapper[4677]: I1007 13:07:36.545096 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:36 crc kubenswrapper[4677]: I1007 13:07:36.545114 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:36 crc kubenswrapper[4677]: I1007 13:07:36.545125 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:36Z","lastTransitionTime":"2025-10-07T13:07:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:07:36 crc kubenswrapper[4677]: I1007 13:07:36.549949 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-czmsr" event={"ID":"67f7f734-b59a-447c-b4a5-5aeb78d3a4dc","Type":"ContainerStarted","Data":"620c115cda692d86e1c655fe633ade8d56b4ad3faff70ec3383e0d6931e91acd"} Oct 07 13:07:36 crc kubenswrapper[4677]: I1007 13:07:36.554917 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29c8j" event={"ID":"3458826a-000d-407d-92c8-236d1a05842e","Type":"ContainerStarted","Data":"7dba5075e656b746f559affe6da7c0989fbbf09769b561bbdd1ddb8e552134a4"} Oct 07 13:07:36 crc kubenswrapper[4677]: I1007 13:07:36.555147 4677 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-29c8j" Oct 07 13:07:36 crc kubenswrapper[4677]: I1007 13:07:36.565234 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c2h2k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6a7b491-6ed9-4906-8d2d-d8913a581b95\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://079f44a0676fd6e659268707658dfce76f5c80881ebd1b7f77b831a653002cbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gh4gv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c2h2k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:36Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:36 crc 
kubenswrapper[4677]: I1007 13:07:36.581118 4677 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-29c8j" Oct 07 13:07:36 crc kubenswrapper[4677]: I1007 13:07:36.583608 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-czmsr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67f7f734-b59a-447c-b4a5-5aeb78d3a4dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://620c115cda692d86e1c655fe633ade8d56b4ad3faff70ec3383e0d6931e91acd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84e2aad54ae491e74845fbb681491bde96694b5f242d15a1b4c0f62b81048e7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84e2aad54ae491e74845fbb681491bde96694b5f242d15a1b4c0f62b81048e7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveR
eadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2dfd4b5d1804d9d8e9ccf4dea8fb23cf60cd039e32d4bfc7d0b0e189da178b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2dfd4b5d1804d9d8e9ccf4dea8fb23cf60cd039e32d4bfc7d0b0e189da178b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c337ed2525a5d0aca41ff193138829ecd98172765110f97081d5a9b57ea39519\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c337ed2525a5d0aca41ff193138829ecd98172765110f97081d5a9b57ea39519\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b53d862bdcbdf9ebb58ca6717073ecf18236be981bf633bf22e3a077fa6ed90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b53d862bdcbdf9ebb58ca6717073ecf18236be981bf633bf22e3a077fa6ed90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-10-07T13:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3de85e8780fadb509355343e297ea73c2bd1047219be9911d445e044d40d378b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3de85e8780fadb509355343e297ea73c2bd1047219be9911d445e044d40d378b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6309cf358d3d85018e9e89fdb974146ab00944de09d3b38cfbf357df59b442f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6309cf358d3d85018e9e89fdb974146ab00944de09d3b38cfbf357df59b442f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-czmsr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:36Z is after 2025-08-24T17:21:41Z" Oct 07 
13:07:36 crc kubenswrapper[4677]: I1007 13:07:36.597583 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf9347ca53bc58ad2e19bdbccd5eb40fde5ef36cdc0c2a2899e7e86977208446\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7932ee6d24ab75f34dabb17b5c2732dc1437e94b4fab6cace5c5bf4d8b4a8fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:36Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:36 crc kubenswrapper[4677]: I1007 13:07:36.613922 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ef29067dc23263a94c4f861ada9ebbe04aae442de3da9fa34db521177f60ce8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:36Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:36 crc kubenswrapper[4677]: I1007 13:07:36.627004 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:36Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:36 crc kubenswrapper[4677]: I1007 13:07:36.637600 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8xd94" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f78c6e9a-e5e3-4296-b2e6-3ba36d1808ff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2e7ebbc9f01ac7f853075c65c8cc57c691cf3f95e41036294486ad4a3bb807c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8xd94\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:36Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:36 crc kubenswrapper[4677]: I1007 13:07:36.647653 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:36 crc kubenswrapper[4677]: I1007 13:07:36.647690 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:36 crc kubenswrapper[4677]: I1007 13:07:36.647703 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:36 crc kubenswrapper[4677]: I1007 13:07:36.647720 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:36 crc kubenswrapper[4677]: I1007 13:07:36.647732 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:36Z","lastTransitionTime":"2025-10-07T13:07:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:07:36 crc kubenswrapper[4677]: I1007 13:07:36.662338 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9c35782-52f8-4fbc-9e52-07ee92002e3d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9182677b05c8d32f333b4e806b6dc29e0ce3f6171616ed303459ccb6a3754a4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://550a4491cebcbd8b3a62831cce07b13bb79051c
d51505aab1f74bcfee692f7b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d90d7cc786a9f94c269a99be97c00685a2e10bde12e0afe4db2de40b95749a47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e99541a9f53339e760fb1074be18ebfcb8b225c64c290478559d2e3722ba9296\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73cec6b690e4d01d9d206a812f278832d622a7bdfb74ddcfb5904e19f721fae6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f16639187300b01b7c9f30dda26da7c1de9de9c66baf7ad716875eba41a5d7d6\\\",\\\"image\\\":\\\"quay.io/openshi
ft-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f16639187300b01b7c9f30dda26da7c1de9de9c66baf7ad716875eba41a5d7d6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dab4e59081dcacddedf345841dce8c940fae52aeee55d6c956e1364dd70872b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dab4e59081dcacddedf345841dce8c940fae52aeee55d6c956e1364dd70872b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://abcfe5aa83c73ae24b7f0b414b47344969924ca50a5e1669bfb4704f1386cd17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abcfe5aa83c73ae24b7f0b414b47344969924ca50a5e1669bfb4704f1386cd17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:36Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:36 crc kubenswrapper[4677]: I1007 13:07:36.676830 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ad65790-6a90-4c21-b5c5-ac1ddf2cbe52\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34b15b8f71a10920a74f784c3440031e14726f661e10f628b269da08e70a7cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a51c6d4e82136b444754dc679f864558f74624af4ff94f794e473c92c8f6c87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96d7ddc445a61d5fd4959d1b3b4e2c93503111a12f461d945dd298a3f8540f65\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f3139524d94a487c756b4a7e3660c0a4380bcba8aa8b588368530f43aab0b1d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f3139524d94a487c756b4a7e3660c0a4380bcba8aa8b588368530f43aab0b1d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T13:07:25Z\\\",\\\"message\\\":\\\"le observer\\\\nW1007 13:07:25.065953 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1007 13:07:25.066115 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 13:07:25.067558 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2107846005/tls.crt::/tmp/serving-cert-2107846005/tls.key\\\\\\\"\\\\nI1007 13:07:25.378394 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1007 13:07:25.383525 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1007 13:07:25.383565 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1007 13:07:25.383605 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1007 13:07:25.383617 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1007 13:07:25.390977 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1007 13:07:25.391014 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1007 13:07:25.391020 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 13:07:25.391038 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 13:07:25.391053 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1007 13:07:25.391061 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1007 13:07:25.391071 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1007 13:07:25.391088 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1007 13:07:25.394664 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b92962cb41f37a615f473651c01e37f5d53e01f3fb4b7c0eb2092095bb55239\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d927fbc5242eb951e8c2ec6d2859315f34e4b190bb43e646044111d1f583bf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d927fbc5242eb951e8c2ec6d2859315f34e4b190bb43e646044111d1f583bf0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:36Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:36 crc kubenswrapper[4677]: I1007 13:07:36.691710 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:36Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:36 crc kubenswrapper[4677]: I1007 13:07:36.705736 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e49f4b2f98de9e297e6a31a5583120192adf9a013700b49bb419e54d9e75fdbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:36Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:36 crc kubenswrapper[4677]: I1007 13:07:36.719576 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-r7cnz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7879fa59-a7cb-4d29-ba3a-c91f43bfcba6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5b2cfaaf4533573a7cdf927cb9a0b61690f4f04ca22f5da5013fd218ee2cba1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r59hd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d3e4ef8267212ad1faf24bfcb3b6f633a283684ba587e304e94d434bd9a2618\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r59hd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-r7cnz\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:36Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:36 crc kubenswrapper[4677]: I1007 13:07:36.742062 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29c8j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3458826a-000d-407d-92c8-236d1a05842e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"
}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrid
es\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\
\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa229d0805576aa04db8e58e52e8c8bdc07663414dde42a94cac4fe628cfac7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa229d0805576aa04db8e58e52e8c8bdc07663414dde42a94cac4fe628cfac7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-29c8j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:36Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:36 crc kubenswrapper[4677]: I1007 13:07:36.750272 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:36 crc kubenswrapper[4677]: I1007 13:07:36.750320 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:36 crc kubenswrapper[4677]: I1007 13:07:36.750333 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:36 crc kubenswrapper[4677]: I1007 13:07:36.750352 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:36 crc kubenswrapper[4677]: I1007 13:07:36.750365 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:36Z","lastTransitionTime":"2025-10-07T13:07:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:07:36 crc kubenswrapper[4677]: I1007 13:07:36.757938 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pjgpx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"73bebfb3-50b5-48b6-b348-1d1feb6202d2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b0ac92c71edc3d5107aece2d0e005a546cf25d79d696f4e330b7c0c8babc546\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h59cg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pjgpx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:36Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:36 crc kubenswrapper[4677]: I1007 13:07:36.772995 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c40c47d2-50a4-43f1-9b6e-08b60a3260c5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f36e52a7e88b59d8fd38c1fe659ce9b539e514c9d31e326a3ed647ebb8d19781\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://84b1c015461fecca9e5122abe950f33e24f4b7188568ea84cb059a08a4637963\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ecf81a2a9f147c0d9643f8e6c45248164053203ca4e5bbdc57c38e5803a5386\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-oper
ator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e8dc3f8bdc52104efdb49a017d6497e2aaa3ed2b593794413fcd1acf2e06d36\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:36Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:36 crc kubenswrapper[4677]: I1007 13:07:36.786552 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:36Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:36 crc kubenswrapper[4677]: I1007 13:07:36.803622 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf9347ca53bc58ad2e19bdbccd5eb40fde5ef36cdc0c2a2899e7e86977208446\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7932ee6d24ab75f34dabb17b5c2732dc1437e94b4fab6cace5c5bf4d8b4a8fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:36Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:36 crc kubenswrapper[4677]: I1007 13:07:36.817276 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c2h2k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6a7b491-6ed9-4906-8d2d-d8913a581b95\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://079f44a0676fd6e659268707658dfce76f5c80881ebd1b7f77b831a653002cbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gh4gv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c2h2k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:36Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:36 crc kubenswrapper[4677]: I1007 13:07:36.838209 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-czmsr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"67f7f734-b59a-447c-b4a5-5aeb78d3a4dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://620c115cda692d86e1c655fe633ade8d56b4ad3faff70ec3383e0d6931e91acd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84e2aad54ae491e74845fbb681491bde96694b5f242d15a1b4c0f62b81048e7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84e2aad54ae491e74845fbb681491bde96694b5f242d15a1b4c0f62b81048e7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2dfd4b5d1804d9d8e9ccf4dea8fb23cf60cd039e32d4bfc7d0b0e189da178b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2dfd4b5d1804d9d8e9ccf4dea8fb23cf60cd039e32d4bfc7d0b0e189da178b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c337ed2525a5d0aca41ff193138829ecd98172765110f97081d5a9b57ea39519\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c337ed2525a5d0aca41ff193138829ecd98172765110f97081d5a9b57ea39519\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b53d862bdcbdf9ebb58ca6717073ecf18236be981bf633bf22e3a077fa6ed90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b53d862bdcbdf9ebb58ca6717073ecf18236be981bf633bf22e3a077fa6ed90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3de85e8780fadb509355343e297ea73c2bd1047219be9911d445e044d40d378b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3de85e8780fadb509355343e297ea73c2bd1047219be9911d445e044d40d378b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6309cf358d3d85018e9e89fdb974146ab00944de09d3b38cfbf357df59b442f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6309cf358d3d85018e9e89fdb974146ab00944de09d3b38cfbf357df59b442f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-czmsr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:36Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:36 crc kubenswrapper[4677]: I1007 13:07:36.851374 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8xd94" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f78c6e9a-e5e3-4296-b2e6-3ba36d1808ff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2e7ebbc9f01ac7f853075c65c8cc57c691cf3f95e41036294486ad4a3bb807c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8xd94\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:36Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:36 crc kubenswrapper[4677]: I1007 13:07:36.853111 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:36 crc kubenswrapper[4677]: I1007 13:07:36.853152 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:36 crc kubenswrapper[4677]: I1007 13:07:36.853168 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:36 crc kubenswrapper[4677]: I1007 13:07:36.853188 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:36 crc kubenswrapper[4677]: I1007 13:07:36.853204 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:36Z","lastTransitionTime":"2025-10-07T13:07:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:07:36 crc kubenswrapper[4677]: I1007 13:07:36.861706 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 13:07:36 crc kubenswrapper[4677]: E1007 13:07:36.861938 4677 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 13:07:44.861912957 +0000 UTC m=+36.347622102 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:07:36 crc kubenswrapper[4677]: I1007 13:07:36.886684 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9c35782-52f8-4fbc-9e52-07ee92002e3d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9182677b05c8d32f333b4e806b6dc29e0ce3f6171616ed303459ccb6a3754a4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://550a4491cebcbd8b3a62831cce07b13bb79051cd51505aab1f74bcfee692f7b2\\\",\\\"image\\\
":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d90d7cc786a9f94c269a99be97c00685a2e10bde12e0afe4db2de40b95749a47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e99541a9f53339e760fb1074be18ebfcb8b225c64c290478559d2e3722ba9296\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73cec6b690e4d01d9d206a812f278832d622a7bdfb74ddcfb5904e19f721fae6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f16639187300b01b7c9f30dda26da7c1de9de9c66baf7ad716875eba41a5d7d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b
7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f16639187300b01b7c9f30dda26da7c1de9de9c66baf7ad716875eba41a5d7d6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dab4e59081dcacddedf345841dce8c940fae52aeee55d6c956e1364dd70872b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dab4e59081dcacddedf345841dce8c940fae52aeee55d6c956e1364dd70872b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://abcfe5aa83c73ae24b7f0b414b47344969924ca50a5e1669bfb4704f1386cd17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abcfe5aa83c73ae24b7f0b414b47344969924ca50a5e1669bfb4704f1386cd17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:36Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:36 crc kubenswrapper[4677]: I1007 13:07:36.902997 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ad65790-6a90-4c21-b5c5-ac1ddf2cbe52\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34b15b8f71a10920a74f784c3440031e14726f661e10f628b269da08e70a7cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a51c6d4e82136b444754dc679f864558f74624af4ff94f794e473c92c8f6c87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96d7ddc445a61d5fd4959d1b3b4e2c93503111a12f461d945dd298a3f8540f65\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f3139524d94a487c756b4a7e3660c0a4380bcba8aa8b588368530f43aab0b1d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f3139524d94a487c756b4a7e3660c0a4380bcba8aa8b588368530f43aab0b1d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T13:07:25Z\\\",\\\"message\\\":\\\"le observer\\\\nW1007 13:07:25.065953 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1007 13:07:25.066115 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 13:07:25.067558 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2107846005/tls.crt::/tmp/serving-cert-2107846005/tls.key\\\\\\\"\\\\nI1007 13:07:25.378394 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1007 13:07:25.383525 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1007 13:07:25.383565 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1007 13:07:25.383605 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1007 13:07:25.383617 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1007 13:07:25.390977 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1007 13:07:25.391014 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1007 13:07:25.391020 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 13:07:25.391038 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 13:07:25.391053 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1007 13:07:25.391061 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1007 13:07:25.391071 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1007 13:07:25.391088 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1007 13:07:25.394664 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b92962cb41f37a615f473651c01e37f5d53e01f3fb4b7c0eb2092095bb55239\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d927fbc5242eb951e8c2ec6d2859315f34e4b190bb43e646044111d1f583bf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d927fbc5242eb951e8c2ec6d2859315f34e4b190bb43e646044111d1f583bf0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:36Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:36 crc kubenswrapper[4677]: I1007 13:07:36.915457 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ef29067dc23263a94c4f861ada9ebbe04aae442de3da9fa34db521177f60ce8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:36Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:36 crc kubenswrapper[4677]: I1007 13:07:36.928562 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:36Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:36 crc kubenswrapper[4677]: I1007 13:07:36.943336 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:36Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:36 crc kubenswrapper[4677]: I1007 13:07:36.955950 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:36 crc kubenswrapper[4677]: I1007 13:07:36.956037 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:36 crc kubenswrapper[4677]: I1007 13:07:36.956052 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:36 crc kubenswrapper[4677]: I1007 13:07:36.956077 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:36 crc kubenswrapper[4677]: I1007 13:07:36.956091 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:36Z","lastTransitionTime":"2025-10-07T13:07:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:07:36 crc kubenswrapper[4677]: I1007 13:07:36.960345 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e49f4b2f98de9e297e6a31a5583120192adf9a013700b49bb419e54d9e75fdbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:36Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:36 crc kubenswrapper[4677]: I1007 13:07:36.963352 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:07:36 crc kubenswrapper[4677]: I1007 13:07:36.963398 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:07:36 crc kubenswrapper[4677]: E1007 13:07:36.963490 4677 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 07 13:07:36 crc kubenswrapper[4677]: E1007 13:07:36.963500 4677 secret.go:188] Couldn't 
get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 07 13:07:36 crc kubenswrapper[4677]: E1007 13:07:36.963535 4677 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-07 13:07:44.96352287 +0000 UTC m=+36.449231985 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 07 13:07:36 crc kubenswrapper[4677]: E1007 13:07:36.963557 4677 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-07 13:07:44.963541831 +0000 UTC m=+36.449250946 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 07 13:07:36 crc kubenswrapper[4677]: I1007 13:07:36.979960 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c40c47d2-50a4-43f1-9b6e-08b60a3260c5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f36e52a7e88b59d8fd38c1fe659ce9b539e514c9d31e326a3ed647ebb8d19781\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://84b1c015461fecca9e5122abe950f33e24f4b7188568ea84cb059a08a4637963\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ecf81a2a9f147c0d9643f8e6c45248164053203ca4e5bbdc57c38e5803a5386\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e8dc3f8bdc52104efdb49a017d6497e2aaa3ed2b593794413fcd1acf2e06d36\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:36Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:36 crc kubenswrapper[4677]: I1007 13:07:36.992689 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:36Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:37 crc kubenswrapper[4677]: I1007 13:07:37.007693 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-r7cnz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7879fa59-a7cb-4d29-ba3a-c91f43bfcba6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5b2cfaaf4533573a7cdf927cb9a0b61690f4f04ca22f5da5013fd218ee2cba1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r59hd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d3e4ef8267212ad1faf24bfcb3b6f633a283684ba587e304e94d434bd9a2618\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r59hd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-r7cnz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:37Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:37 crc kubenswrapper[4677]: I1007 13:07:37.026941 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29c8j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3458826a-000d-407d-92c8-236d1a05842e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cf7d8cdd34bc883eae38c5e4690efd4e1c29cc633b5bbadc5de2b5b844a9da3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b77b2aafb3baf1c5b72d62156bd1c1bec76385637d5795166fe3d4f22a169503\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99b5fbb5ad3249aa5264c37bd635ed5f6283ec72c7eb071002cd7bddc12052f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eee7c253a1a514447553be977a3e534608ef6a1178664bf139ee84ec41180db0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f333db7aeb7d3cd308131992b4cd1284c1c56e27bbfd731404febc0efc953925\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ddf4e352b778815786f6fb204486a53d958310e53569f89a2895fe388a727da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7dba5075e656b746f559affe6da7c0989fbbf097
69b561bbdd1ddb8e552134a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1f410624ff7e026c196d43d5ef830ce7b34981b703d5399a135dab0122640ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccoun
t\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa229d0805576aa04db8e58e52e8c8bdc07663414dde42a94cac4fe628cfac7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa229d0805576aa04db8e58e52e8c8bdc07663414dde42a94cac4fe628cfac7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-29c8j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:37Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:37 crc kubenswrapper[4677]: I1007 13:07:37.041891 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pjgpx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"73bebfb3-50b5-48b6-b348-1d1feb6202d2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b0ac92c71edc3d5107aece2d0e005a546cf25d79d696f4e330b7c0c8babc546\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h59cg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pjgpx\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:37Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:37 crc kubenswrapper[4677]: I1007 13:07:37.060369 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:37 crc kubenswrapper[4677]: I1007 13:07:37.060464 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:37 crc kubenswrapper[4677]: I1007 13:07:37.060483 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:37 crc kubenswrapper[4677]: I1007 13:07:37.060509 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:37 crc kubenswrapper[4677]: I1007 13:07:37.060526 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:37Z","lastTransitionTime":"2025-10-07T13:07:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:07:37 crc kubenswrapper[4677]: I1007 13:07:37.064942 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 13:07:37 crc kubenswrapper[4677]: I1007 13:07:37.065026 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 13:07:37 crc kubenswrapper[4677]: E1007 13:07:37.065178 4677 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 07 13:07:37 crc kubenswrapper[4677]: E1007 13:07:37.065217 4677 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 07 13:07:37 crc kubenswrapper[4677]: E1007 13:07:37.065263 4677 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 07 13:07:37 crc kubenswrapper[4677]: E1007 13:07:37.065277 4677 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 13:07:37 crc kubenswrapper[4677]: E1007 13:07:37.065229 4677 projected.go:288] Couldn't get configMap 
openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 07 13:07:37 crc kubenswrapper[4677]: E1007 13:07:37.065371 4677 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 13:07:37 crc kubenswrapper[4677]: E1007 13:07:37.065347 4677 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-07 13:07:45.06532377 +0000 UTC m=+36.551032885 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 13:07:37 crc kubenswrapper[4677]: E1007 13:07:37.065468 4677 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-07 13:07:45.065448744 +0000 UTC m=+36.551157869 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 13:07:37 crc kubenswrapper[4677]: I1007 13:07:37.163675 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:37 crc kubenswrapper[4677]: I1007 13:07:37.163734 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:37 crc kubenswrapper[4677]: I1007 13:07:37.163751 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:37 crc kubenswrapper[4677]: I1007 13:07:37.163777 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:37 crc kubenswrapper[4677]: I1007 13:07:37.163799 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:37Z","lastTransitionTime":"2025-10-07T13:07:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:07:37 crc kubenswrapper[4677]: I1007 13:07:37.266849 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:37 crc kubenswrapper[4677]: I1007 13:07:37.266931 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:37 crc kubenswrapper[4677]: I1007 13:07:37.266950 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:37 crc kubenswrapper[4677]: I1007 13:07:37.266973 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:37 crc kubenswrapper[4677]: I1007 13:07:37.266990 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:37Z","lastTransitionTime":"2025-10-07T13:07:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:07:37 crc kubenswrapper[4677]: I1007 13:07:37.303751 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 13:07:37 crc kubenswrapper[4677]: E1007 13:07:37.303922 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 13:07:37 crc kubenswrapper[4677]: I1007 13:07:37.304365 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:07:37 crc kubenswrapper[4677]: E1007 13:07:37.304635 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 13:07:37 crc kubenswrapper[4677]: I1007 13:07:37.369837 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:37 crc kubenswrapper[4677]: I1007 13:07:37.369909 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:37 crc kubenswrapper[4677]: I1007 13:07:37.369936 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:37 crc kubenswrapper[4677]: I1007 13:07:37.369976 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:37 crc kubenswrapper[4677]: I1007 13:07:37.370015 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:37Z","lastTransitionTime":"2025-10-07T13:07:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:07:37 crc kubenswrapper[4677]: I1007 13:07:37.472975 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:37 crc kubenswrapper[4677]: I1007 13:07:37.473048 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:37 crc kubenswrapper[4677]: I1007 13:07:37.473073 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:37 crc kubenswrapper[4677]: I1007 13:07:37.473104 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:37 crc kubenswrapper[4677]: I1007 13:07:37.473123 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:37Z","lastTransitionTime":"2025-10-07T13:07:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:07:37 crc kubenswrapper[4677]: I1007 13:07:37.558189 4677 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 07 13:07:37 crc kubenswrapper[4677]: I1007 13:07:37.559916 4677 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-29c8j" Oct 07 13:07:37 crc kubenswrapper[4677]: I1007 13:07:37.580188 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:37 crc kubenswrapper[4677]: I1007 13:07:37.580273 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:37 crc kubenswrapper[4677]: I1007 13:07:37.583591 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:37 crc kubenswrapper[4677]: I1007 13:07:37.583652 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:37 crc kubenswrapper[4677]: I1007 13:07:37.583680 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:37Z","lastTransitionTime":"2025-10-07T13:07:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:07:37 crc kubenswrapper[4677]: I1007 13:07:37.594988 4677 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-29c8j" Oct 07 13:07:37 crc kubenswrapper[4677]: I1007 13:07:37.613441 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:37Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:37 crc kubenswrapper[4677]: I1007 13:07:37.640865 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e49f4b2f98de9e297e6a31a5583120192adf9a013700b49bb419e54d9e75fdbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:37Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:37 crc kubenswrapper[4677]: I1007 13:07:37.661821 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pjgpx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"73bebfb3-50b5-48b6-b348-1d1feb6202d2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b0ac92c71edc3d5107aece2d0e005a546cf25d79d696f4e330b7c0c8babc546\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h59cg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pjgpx\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:37Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:37 crc kubenswrapper[4677]: I1007 13:07:37.683408 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c40c47d2-50a4-43f1-9b6e-08b60a3260c5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f36e52a7e88b59d8fd38c1fe659ce9b539e514c9d31e326a3ed647ebb8d19781\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://84b1c015461fecca9e5122abe950f33e24f4b7188568ea84cb059a08a4637963\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ecf81a2a9f147c0d9643f8e6c45248164053203ca4e5bbdc57c38e5803a5386\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"st
arted\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e8dc3f8bdc52104efdb49a017d6497e2aaa3ed2b593794413fcd1acf2e06d36\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:37Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:37 crc kubenswrapper[4677]: I1007 13:07:37.686407 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:37 crc kubenswrapper[4677]: I1007 13:07:37.686525 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:37 crc kubenswrapper[4677]: I1007 13:07:37.686535 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:37 crc kubenswrapper[4677]: I1007 13:07:37.686549 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:37 crc kubenswrapper[4677]: I1007 13:07:37.686559 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:37Z","lastTransitionTime":"2025-10-07T13:07:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:07:37 crc kubenswrapper[4677]: I1007 13:07:37.704859 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:37Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:37 crc kubenswrapper[4677]: I1007 13:07:37.718781 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-r7cnz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7879fa59-a7cb-4d29-ba3a-c91f43bfcba6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5b2cfaaf4533573a7cdf927cb9a0b61690f4f04ca22f5da5013fd218ee2cba1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r59hd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d3e4ef8267212ad1faf24bfcb3b6f633a283684ba587e304e94d434bd9a2618\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r59hd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-r7cnz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:37Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:37 crc kubenswrapper[4677]: I1007 13:07:37.743944 4677 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29c8j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3458826a-000d-407d-92c8-236d1a05842e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cf7d8cdd34bc883eae38c5e4690efd4e1c29cc633b5bbadc5de2b5b844a9da3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b77b2aafb3baf1c5b72d62156bd1c1bec76385637d5795166fe3d4f22a169503\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99b5fbb5ad3249aa5264c37bd635ed5f6283ec72c7eb071002cd7bddc12052f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eee7c253a1a514447553be977a3e534608ef6a1178664bf139ee84ec41180db0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f333db7aeb7d3cd308131992b4cd1284c1c56e27bbfd731404febc0efc953925\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ddf4e352b778815786f6fb204486a53d958310e53569f89a2895fe388a727da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\
\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7dba5075e656b746f559affe6da7c0989fbbf09769b561bbdd1ddb8e552134a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount
\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1f410624ff7e026c196d43d5ef830ce7b34981b703d5399a135dab0122640ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa229d0805576aa04db8e58e52e8c8bdc07663414dde42a94cac4fe628cfac7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa229d0805576aa04db8e58e52e8c8bdc07663414dde42a94cac4fe628cfac7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-29c8j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:37Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:37 crc kubenswrapper[4677]: I1007 13:07:37.757129 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf9347ca53bc58ad2e19bdbccd5eb40fde5ef36cdc0c2a2899e7e86977208446\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7932ee6d24ab75f34dabb17b5c2732dc1437e94b4fab6cace5c5bf4d8b4a8fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:37Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:37 crc kubenswrapper[4677]: I1007 13:07:37.772236 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c2h2k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6a7b491-6ed9-4906-8d2d-d8913a581b95\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://079f44a0676fd6e659268707658dfce76f5c80881ebd1b7f77b831a653002cbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gh4gv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c2h2k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:37Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:37 crc kubenswrapper[4677]: I1007 13:07:37.787203 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-czmsr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"67f7f734-b59a-447c-b4a5-5aeb78d3a4dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://620c115cda692d86e1c655fe633ade8d56b4ad3faff70ec3383e0d6931e91acd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84e2aad54ae491e74845fbb681491bde96694b5f242d15a1b4c0f62b81048e7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84e2aad54ae491e74845fbb681491bde96694b5f242d15a1b4c0f62b81048e7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2dfd4b5d1804d9d8e9ccf4dea8fb23cf60cd039e32d4bfc7d0b0e189da178b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2dfd4b5d1804d9d8e9ccf4dea8fb23cf60cd039e32d4bfc7d0b0e189da178b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c337ed2525a5d0aca41ff193138829ecd98172765110f97081d5a9b57ea39519\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c337ed2525a5d0aca41ff193138829ecd98172765110f97081d5a9b57ea39519\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b53d862bdcbdf9ebb58ca6717073ecf18236be981bf633bf22e3a077fa6ed90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b53d862bdcbdf9ebb58ca6717073ecf18236be981bf633bf22e3a077fa6ed90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3de85e8780fadb509355343e297ea73c2bd1047219be9911d445e044d40d378b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3de85e8780fadb509355343e297ea73c2bd1047219be9911d445e044d40d378b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6309cf358d3d85018e9e89fdb974146ab00944de09d3b38cfbf357df59b442f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6309cf358d3d85018e9e89fdb974146ab00944de09d3b38cfbf357df59b442f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-czmsr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:37Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:37 crc kubenswrapper[4677]: I1007 13:07:37.788747 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:37 crc kubenswrapper[4677]: I1007 13:07:37.788813 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:37 crc 
kubenswrapper[4677]: I1007 13:07:37.788838 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:37 crc kubenswrapper[4677]: I1007 13:07:37.788866 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:37 crc kubenswrapper[4677]: I1007 13:07:37.788883 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:37Z","lastTransitionTime":"2025-10-07T13:07:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:07:37 crc kubenswrapper[4677]: I1007 13:07:37.801609 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:37Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:37 crc kubenswrapper[4677]: I1007 13:07:37.813450 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8xd94" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f78c6e9a-e5e3-4296-b2e6-3ba36d1808ff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2e7ebbc9f01ac7f853075c65c8cc57c691cf3f95e41036294486ad4a3bb807c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8xd94\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:37Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:37 crc kubenswrapper[4677]: I1007 13:07:37.834236 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9c35782-52f8-4fbc-9e52-07ee92002e3d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9182677b05c8d32f333b4e806b6dc29e0ce3f6171616ed303459ccb6a3754a4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://550a4491cebcbd8b3a62831cce07b13bb79051cd51505aab1f74bcfee692f7b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d90d7cc786a9f94c269a99be97c00685a2e10bde12e0afe4db2de40b95749a47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"la
stState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e99541a9f53339e760fb1074be18ebfcb8b225c64c290478559d2e3722ba9296\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73cec6b690e4d01d9d206a812f278832d622a7bdfb74ddcfb5904e19f721fae6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f16639187300b01b7c9f30dda26da7c1de9de9c66baf7ad716875eba41a5d7d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f16639187300b01b7c9f30dda26da7c1de9de9c66baf7ad716875eba41a5d7d6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dab4e59081dcacddedf345841dce8c940fae52aeee55d6c956e1364dd70872b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",
\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dab4e59081dcacddedf345841dce8c940fae52aeee55d6c956e1364dd70872b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://abcfe5aa83c73ae24b7f0b414b47344969924ca50a5e1669bfb4704f1386cd17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abcfe5aa83c73ae24b7f0b414b47344969924ca50a5e1669bfb4704f1386cd17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:37Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:37 crc kubenswrapper[4677]: I1007 13:07:37.852858 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ad65790-6a90-4c21-b5c5-ac1ddf2cbe52\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34b15b8f71a10920a74f784c3440031e14726f661e10f628b269da08e70a7cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a51c6d4e82136b444754dc679f864558f74624af4ff94f794e473c92c8f6c87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96d7ddc445a61d5fd4959d1b3b4e2c93503111a12f461d945dd298a3f8540f65\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f3139524d94a487c756b4a7e3660c0a4380bcba8aa8b588368530f43aab0b1d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f3139524d94a487c756b4a7e3660c0a4380bcba8aa8b588368530f43aab0b1d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T13:07:25Z\\\",\\\"message\\\":\\\"le observer\\\\nW1007 13:07:25.065953 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1007 13:07:25.066115 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 13:07:25.067558 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2107846005/tls.crt::/tmp/serving-cert-2107846005/tls.key\\\\\\\"\\\\nI1007 13:07:25.378394 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1007 13:07:25.383525 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1007 13:07:25.383565 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1007 13:07:25.383605 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1007 13:07:25.383617 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1007 13:07:25.390977 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1007 13:07:25.391014 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1007 13:07:25.391020 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 13:07:25.391038 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 13:07:25.391053 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1007 13:07:25.391061 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1007 13:07:25.391071 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1007 13:07:25.391088 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1007 13:07:25.394664 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b92962cb41f37a615f473651c01e37f5d53e01f3fb4b7c0eb2092095bb55239\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d927fbc5242eb951e8c2ec6d2859315f34e4b190bb43e646044111d1f583bf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d927fbc5242eb951e8c2ec6d2859315f34e4b190bb43e646044111d1f583bf0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:37Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:37 crc kubenswrapper[4677]: I1007 13:07:37.877856 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ef29067dc23263a94c4f861ada9ebbe04aae442de3da9fa34db521177f60ce8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:37Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:37 crc kubenswrapper[4677]: I1007 13:07:37.891803 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:37 crc kubenswrapper[4677]: I1007 13:07:37.891849 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:37 crc kubenswrapper[4677]: I1007 13:07:37.891860 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:37 crc kubenswrapper[4677]: I1007 13:07:37.891876 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:37 crc kubenswrapper[4677]: I1007 13:07:37.891889 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:37Z","lastTransitionTime":"2025-10-07T13:07:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:07:37 crc kubenswrapper[4677]: I1007 13:07:37.994832 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:37 crc kubenswrapper[4677]: I1007 13:07:37.994872 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:37 crc kubenswrapper[4677]: I1007 13:07:37.994882 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:37 crc kubenswrapper[4677]: I1007 13:07:37.994899 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:37 crc kubenswrapper[4677]: I1007 13:07:37.994911 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:37Z","lastTransitionTime":"2025-10-07T13:07:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:07:38 crc kubenswrapper[4677]: I1007 13:07:38.097785 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:38 crc kubenswrapper[4677]: I1007 13:07:38.098086 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:38 crc kubenswrapper[4677]: I1007 13:07:38.098196 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:38 crc kubenswrapper[4677]: I1007 13:07:38.098299 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:38 crc kubenswrapper[4677]: I1007 13:07:38.098415 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:38Z","lastTransitionTime":"2025-10-07T13:07:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:07:38 crc kubenswrapper[4677]: I1007 13:07:38.200956 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:38 crc kubenswrapper[4677]: I1007 13:07:38.201194 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:38 crc kubenswrapper[4677]: I1007 13:07:38.201284 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:38 crc kubenswrapper[4677]: I1007 13:07:38.201400 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:38 crc kubenswrapper[4677]: I1007 13:07:38.201514 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:38Z","lastTransitionTime":"2025-10-07T13:07:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:07:38 crc kubenswrapper[4677]: I1007 13:07:38.303931 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 13:07:38 crc kubenswrapper[4677]: E1007 13:07:38.304120 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 13:07:38 crc kubenswrapper[4677]: I1007 13:07:38.305731 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:38 crc kubenswrapper[4677]: I1007 13:07:38.305839 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:38 crc kubenswrapper[4677]: I1007 13:07:38.305937 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:38 crc kubenswrapper[4677]: I1007 13:07:38.306030 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:38 crc kubenswrapper[4677]: I1007 13:07:38.306113 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:38Z","lastTransitionTime":"2025-10-07T13:07:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:07:38 crc kubenswrapper[4677]: I1007 13:07:38.408067 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:38 crc kubenswrapper[4677]: I1007 13:07:38.408109 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:38 crc kubenswrapper[4677]: I1007 13:07:38.408121 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:38 crc kubenswrapper[4677]: I1007 13:07:38.408136 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:38 crc kubenswrapper[4677]: I1007 13:07:38.408148 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:38Z","lastTransitionTime":"2025-10-07T13:07:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:07:38 crc kubenswrapper[4677]: I1007 13:07:38.511507 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:38 crc kubenswrapper[4677]: I1007 13:07:38.511553 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:38 crc kubenswrapper[4677]: I1007 13:07:38.511564 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:38 crc kubenswrapper[4677]: I1007 13:07:38.511583 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:38 crc kubenswrapper[4677]: I1007 13:07:38.511595 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:38Z","lastTransitionTime":"2025-10-07T13:07:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:07:38 crc kubenswrapper[4677]: I1007 13:07:38.561500 4677 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 07 13:07:38 crc kubenswrapper[4677]: I1007 13:07:38.614265 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:38 crc kubenswrapper[4677]: I1007 13:07:38.614302 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:38 crc kubenswrapper[4677]: I1007 13:07:38.614313 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:38 crc kubenswrapper[4677]: I1007 13:07:38.614329 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:38 crc kubenswrapper[4677]: I1007 13:07:38.614340 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:38Z","lastTransitionTime":"2025-10-07T13:07:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:07:38 crc kubenswrapper[4677]: I1007 13:07:38.719538 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:38 crc kubenswrapper[4677]: I1007 13:07:38.719593 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:38 crc kubenswrapper[4677]: I1007 13:07:38.719604 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:38 crc kubenswrapper[4677]: I1007 13:07:38.719619 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:38 crc kubenswrapper[4677]: I1007 13:07:38.719630 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:38Z","lastTransitionTime":"2025-10-07T13:07:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:07:38 crc kubenswrapper[4677]: I1007 13:07:38.822278 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:38 crc kubenswrapper[4677]: I1007 13:07:38.822317 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:38 crc kubenswrapper[4677]: I1007 13:07:38.822325 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:38 crc kubenswrapper[4677]: I1007 13:07:38.822339 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:38 crc kubenswrapper[4677]: I1007 13:07:38.822348 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:38Z","lastTransitionTime":"2025-10-07T13:07:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:07:38 crc kubenswrapper[4677]: I1007 13:07:38.925108 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:38 crc kubenswrapper[4677]: I1007 13:07:38.925387 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:38 crc kubenswrapper[4677]: I1007 13:07:38.925516 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:38 crc kubenswrapper[4677]: I1007 13:07:38.925611 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:38 crc kubenswrapper[4677]: I1007 13:07:38.925699 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:38Z","lastTransitionTime":"2025-10-07T13:07:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:07:39 crc kubenswrapper[4677]: I1007 13:07:39.029418 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:39 crc kubenswrapper[4677]: I1007 13:07:39.029688 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:39 crc kubenswrapper[4677]: I1007 13:07:39.029764 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:39 crc kubenswrapper[4677]: I1007 13:07:39.029861 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:39 crc kubenswrapper[4677]: I1007 13:07:39.029948 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:39Z","lastTransitionTime":"2025-10-07T13:07:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:07:39 crc kubenswrapper[4677]: I1007 13:07:39.132877 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:39 crc kubenswrapper[4677]: I1007 13:07:39.132960 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:39 crc kubenswrapper[4677]: I1007 13:07:39.132987 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:39 crc kubenswrapper[4677]: I1007 13:07:39.133017 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:39 crc kubenswrapper[4677]: I1007 13:07:39.133039 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:39Z","lastTransitionTime":"2025-10-07T13:07:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:07:39 crc kubenswrapper[4677]: I1007 13:07:39.236262 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:39 crc kubenswrapper[4677]: I1007 13:07:39.236298 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:39 crc kubenswrapper[4677]: I1007 13:07:39.236308 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:39 crc kubenswrapper[4677]: I1007 13:07:39.236327 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:39 crc kubenswrapper[4677]: I1007 13:07:39.236339 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:39Z","lastTransitionTime":"2025-10-07T13:07:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:07:39 crc kubenswrapper[4677]: I1007 13:07:39.303158 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:07:39 crc kubenswrapper[4677]: I1007 13:07:39.303184 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 13:07:39 crc kubenswrapper[4677]: E1007 13:07:39.303378 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 13:07:39 crc kubenswrapper[4677]: E1007 13:07:39.303613 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 13:07:39 crc kubenswrapper[4677]: I1007 13:07:39.338908 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9c35782-52f8-4fbc-9e52-07ee92002e3d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9182677b05c8d32f333b4e806b6dc29e0ce3f6171616ed303459ccb6a3754a4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://550a4491cebcbd8b3a62831cce07b13bb79051cd51505aab1f74bcfee692f7b2\\\",\\\"image\\\":\\\"quay.
io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d90d7cc786a9f94c269a99be97c00685a2e10bde12e0afe4db2de40b95749a47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e99541a9f53339e760fb1074be18ebfcb8b225c64c290478559d2e3722ba9296\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73cec6b690e4d01d9d206a812f278832d622a7bdfb74ddcfb5904e19f721fae6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f16639187300b01b7c9f30dda26da7c1de9de9c66baf7ad716875eba41a5d7d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ec
d6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f16639187300b01b7c9f30dda26da7c1de9de9c66baf7ad716875eba41a5d7d6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dab4e59081dcacddedf345841dce8c940fae52aeee55d6c956e1364dd70872b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dab4e59081dcacddedf345841dce8c940fae52aeee55d6c956e1364dd70872b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://abcfe5aa83c73ae24b7f0b414b47344969924ca50a5e1669bfb4704f1386cd17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abcfe5aa83c73ae24b7f0b414b47344969924ca50a5e1669bfb4704f1386cd17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:39Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:39 crc kubenswrapper[4677]: I1007 13:07:39.339486 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:39 crc kubenswrapper[4677]: I1007 13:07:39.339534 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:39 crc kubenswrapper[4677]: I1007 13:07:39.339557 4677 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:39 crc kubenswrapper[4677]: I1007 13:07:39.339586 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:39 crc kubenswrapper[4677]: I1007 13:07:39.339611 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:39Z","lastTransitionTime":"2025-10-07T13:07:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:07:39 crc kubenswrapper[4677]: I1007 13:07:39.362018 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ad65790-6a90-4c21-b5c5-ac1ddf2cbe52\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34b15b8f71a10920a74f784c3440031e14726f661e10f628b269da08e70a7cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a51c6d4e82136b444754dc679f864558f74624af4ff94f794e473c92c8f6c87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\"
:{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96d7ddc445a61d5fd4959d1b3b4e2c93503111a12f461d945dd298a3f8540f65\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f3139524d94a487c756b4a7e3660c0a4380bcba8aa8b588368530f43aab0b1d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f3139524d94a487c756b4a7e3660c0a4380bcba8aa8b588368530f43aab0b1d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T13:07:25Z\\\",\\\"message\\\":\\\"le observer\\\\nW1007 13:07:25.065953 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1007 13:07:25.066115 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 13:07:25.067558 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2107846005/tls.crt::/tmp/serving-cert-2107846005/tls.key\\\\\\\"\\\\nI1007 13:07:25.378394 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1007 13:07:25.383525 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1007 13:07:25.383565 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1007 13:07:25.383605 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1007 13:07:25.383617 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1007 13:07:25.390977 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1007 13:07:25.391014 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1007 13:07:25.391020 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 13:07:25.391038 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 13:07:25.391053 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1007 13:07:25.391061 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1007 13:07:25.391071 1 secure_serving.go:69] Use of insecure 
cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1007 13:07:25.391088 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1007 13:07:25.394664 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b92962cb41f37a615f473651c01e37f5d53e01f3fb4b7c0eb2092095bb55239\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d927fbc5242eb951e8c2ec6d2859315f34e4b190bb43e646044111d1f583bf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d927fbc5242eb951e8c2ec6d2859315f34e4b190bb43e646044111d1f583bf0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:39Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:39 crc kubenswrapper[4677]: I1007 13:07:39.382802 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ef29067dc23263a94c4f861ada9ebbe04aae442de3da9fa34db521177f60ce8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:39Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:39 crc kubenswrapper[4677]: I1007 13:07:39.405228 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:39Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:39 crc kubenswrapper[4677]: I1007 13:07:39.419033 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8xd94" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f78c6e9a-e5e3-4296-b2e6-3ba36d1808ff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2e7ebbc9f01ac7f853075c65c8cc57c691cf3f95e41036294486ad4a3bb807c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8xd94\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:39Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:39 crc kubenswrapper[4677]: I1007 13:07:39.433943 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:39Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:39 crc kubenswrapper[4677]: I1007 13:07:39.442156 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:39 crc kubenswrapper[4677]: I1007 13:07:39.442225 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:39 crc kubenswrapper[4677]: I1007 13:07:39.442245 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:39 crc kubenswrapper[4677]: I1007 13:07:39.442270 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:39 crc kubenswrapper[4677]: I1007 13:07:39.442287 4677 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:39Z","lastTransitionTime":"2025-10-07T13:07:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:07:39 crc kubenswrapper[4677]: I1007 13:07:39.449788 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e49f4b2f98de9e297e6a31a5583120192adf9a013700b49bb419e54d9e75fdbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:39Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:39 crc kubenswrapper[4677]: I1007 13:07:39.466971 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c40c47d2-50a4-43f1-9b6e-08b60a3260c5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f36e52a7e88b59d8fd38c1fe659ce9b539e514c9d31e326a3ed647ebb8d19781\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://84b1c015461fecca9e5122abe950f33e24f4b7188568ea84cb059a08a4637963\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ecf81a2a9f147c0d9643f8e6c45248164053203ca4e5bbdc57c38e5803a5386\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e8dc3f8bdc52104efdb49a017d6497e2aaa3ed2b593794413fcd1acf2e06d36\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:39Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:39 crc kubenswrapper[4677]: I1007 13:07:39.479053 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:39Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:39 crc kubenswrapper[4677]: I1007 13:07:39.490396 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-r7cnz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7879fa59-a7cb-4d29-ba3a-c91f43bfcba6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5b2cfaaf4533573a7cdf927cb9a0b61690f4f04ca22f5da5013fd218ee2cba1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r59hd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d3e4ef8267212ad1faf24bfcb3b6f633a283684ba587e304e94d434bd9a2618\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r59hd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-r7cnz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:39Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:39 crc kubenswrapper[4677]: I1007 13:07:39.513854 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29c8j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3458826a-000d-407d-92c8-236d1a05842e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cf7d8cdd34bc883eae38c5e4690efd4e1c29cc633b5bbadc5de2b5b844a9da3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b77b2aafb3baf1c5b72d62156bd1c1bec76385637d5795166fe3d4f22a169503\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99b5fbb5ad3249aa5264c37bd635ed5f6283ec72c7eb071002cd7bddc12052f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eee7c253a1a514447553be977a3e534608ef6a1178664bf139ee84ec41180db0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f333db7aeb7d3cd308131992b4cd1284c1c56e27bbfd731404febc0efc953925\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ddf4e352b778815786f6fb204486a53d958310e53569f89a2895fe388a727da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7dba5075e656b746f559affe6da7c0989fbbf097
69b561bbdd1ddb8e552134a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1f410624ff7e026c196d43d5ef830ce7b34981b703d5399a135dab0122640ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccoun
t\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa229d0805576aa04db8e58e52e8c8bdc07663414dde42a94cac4fe628cfac7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa229d0805576aa04db8e58e52e8c8bdc07663414dde42a94cac4fe628cfac7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-29c8j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:39Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:39 crc kubenswrapper[4677]: I1007 13:07:39.527880 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pjgpx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"73bebfb3-50b5-48b6-b348-1d1feb6202d2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b0ac92c71edc3d5107aece2d0e005a546cf25d79d696f4e330b7c0c8babc546\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h59cg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pjgpx\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:39Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:39 crc kubenswrapper[4677]: I1007 13:07:39.540959 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf9347ca53bc58ad2e19bdbccd5eb40fde5ef36cdc0c2a2899e7e86977208446\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7932ee6d24ab75f34dabb17b5c2732dc1437e94b4fab6cace5c5bf4d8b4a8fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:39Z is after 
2025-08-24T17:21:41Z" Oct 07 13:07:39 crc kubenswrapper[4677]: I1007 13:07:39.544691 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:39 crc kubenswrapper[4677]: I1007 13:07:39.544718 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:39 crc kubenswrapper[4677]: I1007 13:07:39.544726 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:39 crc kubenswrapper[4677]: I1007 13:07:39.544739 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:39 crc kubenswrapper[4677]: I1007 13:07:39.544750 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:39Z","lastTransitionTime":"2025-10-07T13:07:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:07:39 crc kubenswrapper[4677]: I1007 13:07:39.549101 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c2h2k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6a7b491-6ed9-4906-8d2d-d8913a581b95\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://079f44a0676fd6e659268707658dfce76f5c80881ebd1b7f77b831a653002cbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gh4gv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c2h2k\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:39Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:39 crc kubenswrapper[4677]: I1007 13:07:39.561006 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-czmsr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67f7f734-b59a-447c-b4a5-5aeb78d3a4dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://620c115cda692d86e1c655fe633ade8d56b4ad3faff70ec3383e0d6931e91acd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84e2aad54ae491e74845fbb681491bde96694b5f242d15a1b4c0f62b81048e7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84e2aad54ae491e74845fbb681491bde96694b5f242d15a1b4c0f62b81048e7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPa
th\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2dfd4b5d1804d9d8e9ccf4dea8fb23cf60cd039e32d4bfc7d0b0e189da178b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2dfd4b5d1804d9d8e9ccf4dea8fb23cf60cd039e32d4bfc7d0b0e189da178b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c337ed2525a5d0aca41ff193138829ecd98172765110f97081d5a9b57ea39519\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c337ed2525a5d0aca41ff193138829ecd98172765110f97081d5a9b57ea39519\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b53d862bdcbdf9ebb58ca6717073ecf18236be981bf633bf22e3a077fa6ed90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b53d862bdcbdf9ebb58ca6717073ecf18236be981bf633bf22e
3a077fa6ed90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3de85e8780fadb509355343e297ea73c2bd1047219be9911d445e044d40d378b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3de85e8780fadb509355343e297ea73c2bd1047219be9911d445e044d40d378b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6309cf358d3d85018e9e89fdb974146ab00944de09d3b38cfbf357df59b442f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6309cf358d3d85018e9e89fdb974146ab00944de09d3b38cfbf357df59b442f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-czmsr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:39Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:39 crc kubenswrapper[4677]: I1007 13:07:39.564971 4677 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 07 13:07:39 crc kubenswrapper[4677]: I1007 13:07:39.648741 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:39 crc kubenswrapper[4677]: I1007 13:07:39.648794 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:39 crc kubenswrapper[4677]: I1007 13:07:39.648814 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:39 crc kubenswrapper[4677]: I1007 13:07:39.648839 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:39 crc kubenswrapper[4677]: I1007 13:07:39.648856 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:39Z","lastTransitionTime":"2025-10-07T13:07:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:07:39 crc kubenswrapper[4677]: I1007 13:07:39.752508 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:39 crc kubenswrapper[4677]: I1007 13:07:39.752559 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:39 crc kubenswrapper[4677]: I1007 13:07:39.752578 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:39 crc kubenswrapper[4677]: I1007 13:07:39.752604 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:39 crc kubenswrapper[4677]: I1007 13:07:39.752626 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:39Z","lastTransitionTime":"2025-10-07T13:07:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:07:39 crc kubenswrapper[4677]: I1007 13:07:39.856777 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:39 crc kubenswrapper[4677]: I1007 13:07:39.856820 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:39 crc kubenswrapper[4677]: I1007 13:07:39.856831 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:39 crc kubenswrapper[4677]: I1007 13:07:39.856850 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:39 crc kubenswrapper[4677]: I1007 13:07:39.856861 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:39Z","lastTransitionTime":"2025-10-07T13:07:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:07:39 crc kubenswrapper[4677]: I1007 13:07:39.959115 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:39 crc kubenswrapper[4677]: I1007 13:07:39.959152 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:39 crc kubenswrapper[4677]: I1007 13:07:39.959164 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:39 crc kubenswrapper[4677]: I1007 13:07:39.959182 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:39 crc kubenswrapper[4677]: I1007 13:07:39.959194 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:39Z","lastTransitionTime":"2025-10-07T13:07:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:07:40 crc kubenswrapper[4677]: I1007 13:07:40.062727 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:40 crc kubenswrapper[4677]: I1007 13:07:40.062817 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:40 crc kubenswrapper[4677]: I1007 13:07:40.062845 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:40 crc kubenswrapper[4677]: I1007 13:07:40.062880 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:40 crc kubenswrapper[4677]: I1007 13:07:40.062903 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:40Z","lastTransitionTime":"2025-10-07T13:07:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:07:40 crc kubenswrapper[4677]: I1007 13:07:40.165247 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:40 crc kubenswrapper[4677]: I1007 13:07:40.165297 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:40 crc kubenswrapper[4677]: I1007 13:07:40.165312 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:40 crc kubenswrapper[4677]: I1007 13:07:40.165333 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:40 crc kubenswrapper[4677]: I1007 13:07:40.165346 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:40Z","lastTransitionTime":"2025-10-07T13:07:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:07:40 crc kubenswrapper[4677]: I1007 13:07:40.268540 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:40 crc kubenswrapper[4677]: I1007 13:07:40.268902 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:40 crc kubenswrapper[4677]: I1007 13:07:40.269055 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:40 crc kubenswrapper[4677]: I1007 13:07:40.269224 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:40 crc kubenswrapper[4677]: I1007 13:07:40.269369 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:40Z","lastTransitionTime":"2025-10-07T13:07:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:07:40 crc kubenswrapper[4677]: I1007 13:07:40.302179 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 13:07:40 crc kubenswrapper[4677]: E1007 13:07:40.302341 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 13:07:40 crc kubenswrapper[4677]: I1007 13:07:40.373236 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:40 crc kubenswrapper[4677]: I1007 13:07:40.373323 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:40 crc kubenswrapper[4677]: I1007 13:07:40.373350 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:40 crc kubenswrapper[4677]: I1007 13:07:40.373384 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:40 crc kubenswrapper[4677]: I1007 13:07:40.373408 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:40Z","lastTransitionTime":"2025-10-07T13:07:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:07:40 crc kubenswrapper[4677]: I1007 13:07:40.477013 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:40 crc kubenswrapper[4677]: I1007 13:07:40.477076 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:40 crc kubenswrapper[4677]: I1007 13:07:40.477098 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:40 crc kubenswrapper[4677]: I1007 13:07:40.477130 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:40 crc kubenswrapper[4677]: I1007 13:07:40.477154 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:40Z","lastTransitionTime":"2025-10-07T13:07:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:07:40 crc kubenswrapper[4677]: I1007 13:07:40.569430 4677 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-29c8j_3458826a-000d-407d-92c8-236d1a05842e/ovnkube-controller/0.log" Oct 07 13:07:40 crc kubenswrapper[4677]: I1007 13:07:40.572925 4677 generic.go:334] "Generic (PLEG): container finished" podID="3458826a-000d-407d-92c8-236d1a05842e" containerID="7dba5075e656b746f559affe6da7c0989fbbf09769b561bbdd1ddb8e552134a4" exitCode=1 Oct 07 13:07:40 crc kubenswrapper[4677]: I1007 13:07:40.573028 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29c8j" event={"ID":"3458826a-000d-407d-92c8-236d1a05842e","Type":"ContainerDied","Data":"7dba5075e656b746f559affe6da7c0989fbbf09769b561bbdd1ddb8e552134a4"} Oct 07 13:07:40 crc kubenswrapper[4677]: I1007 13:07:40.573891 4677 scope.go:117] "RemoveContainer" containerID="7dba5075e656b746f559affe6da7c0989fbbf09769b561bbdd1ddb8e552134a4" Oct 07 13:07:40 crc kubenswrapper[4677]: I1007 13:07:40.579185 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:40 crc kubenswrapper[4677]: I1007 13:07:40.579305 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:40 crc kubenswrapper[4677]: I1007 13:07:40.579406 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:40 crc kubenswrapper[4677]: I1007 13:07:40.579532 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:40 crc kubenswrapper[4677]: I1007 13:07:40.579618 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:40Z","lastTransitionTime":"2025-10-07T13:07:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:07:40 crc kubenswrapper[4677]: I1007 13:07:40.597023 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf9347ca53bc58ad2e19bdbccd5eb40fde5ef36cdc0c2a2899e7e86977208446\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7932ee6d24ab75f34dabb17b5c2732dc1437e94b4fab6cace5c5bf4d8b4a8fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:40Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:40 crc kubenswrapper[4677]: I1007 13:07:40.612143 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c2h2k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6a7b491-6ed9-4906-8d2d-d8913a581b95\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://079f44a0676fd6e659268707658dfce76f5c80881ebd1b7f77b831a653002cbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gh4gv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c2h2k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:40Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:40 crc kubenswrapper[4677]: I1007 13:07:40.633384 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-czmsr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"67f7f734-b59a-447c-b4a5-5aeb78d3a4dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://620c115cda692d86e1c655fe633ade8d56b4ad3faff70ec3383e0d6931e91acd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84e2aad54ae491e74845fbb681491bde96694b5f242d15a1b4c0f62b81048e7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84e2aad54ae491e74845fbb681491bde96694b5f242d15a1b4c0f62b81048e7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2dfd4b5d1804d9d8e9ccf4dea8fb23cf60cd039e32d4bfc7d0b0e189da178b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2dfd4b5d1804d9d8e9ccf4dea8fb23cf60cd039e32d4bfc7d0b0e189da178b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c337ed2525a5d0aca41ff193138829ecd98172765110f97081d5a9b57ea39519\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c337ed2525a5d0aca41ff193138829ecd98172765110f97081d5a9b57ea39519\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b53d862bdcbdf9ebb58ca6717073ecf18236be981bf633bf22e3a077fa6ed90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b53d862bdcbdf9ebb58ca6717073ecf18236be981bf633bf22e3a077fa6ed90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3de85e8780fadb509355343e297ea73c2bd1047219be9911d445e044d40d378b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3de85e8780fadb509355343e297ea73c2bd1047219be9911d445e044d40d378b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6309cf358d3d85018e9e89fdb974146ab00944de09d3b38cfbf357df59b442f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6309cf358d3d85018e9e89fdb974146ab00944de09d3b38cfbf357df59b442f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-czmsr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:40Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:40 crc kubenswrapper[4677]: I1007 13:07:40.654256 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ad65790-6a90-4c21-b5c5-ac1ddf2cbe52\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34b15b8f71a10920a74f784c3440031e14726f661e10f628b269da08e70a7cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a51c6d4e82136b444754dc679f864558f74624af4ff94f794e473c92c8f6c87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96d7ddc445a61d5fd4959d1b3b4e2c93503111a12f461d945dd298a3f8540f65\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f3139524d94a487c756b4a7e3660c0a4380bcba8aa8b588368530f43aab0b1d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f3139524d94a487c756b4a7e3660c0a4380bcba8aa8b588368530f43aab0b1d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T13:07:25Z\\\",\\\"message\\\":\\\"le observer\\\\nW1007 13:07:25.065953 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1007 13:07:25.066115 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 13:07:25.067558 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2107846005/tls.crt::/tmp/serving-cert-2107846005/tls.key\\\\\\\"\\\\nI1007 13:07:25.378394 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1007 13:07:25.383525 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1007 13:07:25.383565 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1007 13:07:25.383605 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1007 13:07:25.383617 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1007 13:07:25.390977 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1007 13:07:25.391014 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1007 13:07:25.391020 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 13:07:25.391038 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 13:07:25.391053 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1007 13:07:25.391061 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1007 13:07:25.391071 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1007 13:07:25.391088 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1007 13:07:25.394664 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b92962cb41f37a615f473651c01e37f5d53e01f3fb4b7c0eb2092095bb55239\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d927fbc5242eb951e8c2ec6d2859315f34e4b190bb43e646044111d1f583bf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d927fbc5242eb951e8c2ec6d2859315f34e4b190bb43e646044111d1f583bf0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:40Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:40 crc kubenswrapper[4677]: I1007 13:07:40.671530 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ef29067dc23263a94c4f861ada9ebbe04aae442de3da9fa34db521177f60ce8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:40Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:40 crc kubenswrapper[4677]: I1007 13:07:40.682479 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:40 crc kubenswrapper[4677]: I1007 13:07:40.682504 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:40 crc kubenswrapper[4677]: I1007 13:07:40.682513 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:40 crc kubenswrapper[4677]: I1007 13:07:40.682526 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:40 crc kubenswrapper[4677]: I1007 13:07:40.682535 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:40Z","lastTransitionTime":"2025-10-07T13:07:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:07:40 crc kubenswrapper[4677]: I1007 13:07:40.686244 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:40Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:40 crc kubenswrapper[4677]: I1007 13:07:40.695893 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8xd94" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f78c6e9a-e5e3-4296-b2e6-3ba36d1808ff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2e7ebbc9f01ac7f853075c65c8cc57c691cf3f95e41036294486ad4a3bb807c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8xd94\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:40Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:40 crc kubenswrapper[4677]: I1007 13:07:40.718057 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9c35782-52f8-4fbc-9e52-07ee92002e3d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9182677b05c8d32f333b4e806b6dc29e0ce3f6171616ed303459ccb6a3754a4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://550a4491cebcbd8b3a62831cce07b13bb79051cd51505aab1f74bcfee692f7b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d90d7cc786a9f94c269a99be97c00685a2e10bde12e0afe4db2de40b95749a47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e99541a9f53339e760fb1074be18ebfcb8b225c
64c290478559d2e3722ba9296\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73cec6b690e4d01d9d206a812f278832d622a7bdfb74ddcfb5904e19f721fae6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f16639187300b01b7c9f30dda26da7c1de9de9c66baf7ad716875eba41a5d7d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f16639187300b01b7c9f30dda26da7c1de9de9c66baf7ad716875eba41a5d7d6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dab4e59081dcacddedf345841dce8c940fae52aeee55d6c956e1364dd70872b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dab4e59081dcacddedf345841dce8c940fae52aeee55d6c956e1364dd70872b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://abcfe5aa83c73ae24b7f0b414b47344969924ca50a5e1669bfb4704f1386cd17\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abcfe5aa83c73ae24b7f0b414b47344969924ca50a5e1669bfb4704f1386cd17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:40Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:40 crc kubenswrapper[4677]: I1007 13:07:40.733824 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e49f4b2f98de9e297e6a31a5583120192adf9a013700b49bb419e54d9e75fdbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:40Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:40 crc kubenswrapper[4677]: I1007 13:07:40.745355 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:40Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:40 crc kubenswrapper[4677]: I1007 13:07:40.757619 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:40Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:40 crc kubenswrapper[4677]: I1007 13:07:40.769069 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-r7cnz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7879fa59-a7cb-4d29-ba3a-c91f43bfcba6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5b2cfaaf4533573a7cdf927cb9a0b61690f4f04ca22f5da5013fd218ee2cba1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r59hd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d3e4ef8267212ad1faf24bfcb3b6f633a283684ba587e304e94d434bd9a2618\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r59hd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-r7cnz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:40Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:40 crc kubenswrapper[4677]: I1007 13:07:40.785349 4677 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:40 crc kubenswrapper[4677]: I1007 13:07:40.785555 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:40 crc kubenswrapper[4677]: I1007 13:07:40.785583 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:40 crc kubenswrapper[4677]: I1007 13:07:40.785599 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:40 crc kubenswrapper[4677]: I1007 13:07:40.785610 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:40Z","lastTransitionTime":"2025-10-07T13:07:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:07:40 crc kubenswrapper[4677]: I1007 13:07:40.793699 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29c8j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3458826a-000d-407d-92c8-236d1a05842e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cf7d8cdd34bc883eae38c5e4690efd4e1c29cc633b5bbadc5de2b5b844a9da3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b77b2aafb3baf1c5b72d62156bd1c1bec76385637d5795166fe3d4f22a169503\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99b5fbb5ad3249aa5264c37bd635ed5f6283ec72c7eb071002cd7bddc12052f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eee7c253a1a514447553be977a3e534608ef6a1178664bf139ee84ec41180db0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f333db7aeb7d3cd308131992b4cd1284c1c56e27bbfd731404febc0efc953925\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ddf4e352b778815786f6fb204486a53d958310e53569f89a2895fe388a727da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7dba5075e656b746f559affe6da7c0989fbbf097
69b561bbdd1ddb8e552134a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7dba5075e656b746f559affe6da7c0989fbbf09769b561bbdd1ddb8e552134a4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T13:07:39Z\\\",\\\"message\\\":\\\"orkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1007 13:07:39.213014 5962 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1007 13:07:39.213230 5962 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/factory.go:117\\\\nI1007 13:07:39.213573 5962 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1007 13:07:39.213591 5962 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1007 13:07:39.213763 5962 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1007 13:07:39.214112 5962 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1007 13:07:39.214258 5962 reflector.go:311] Stopping reflector *v1.Namespace (0s) from 
k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1f410624ff7e026c196d43d5ef830ce7b34981b703d5399a135dab0122640ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa229d0805576aa04db8e58e52e8c8bdc07663414dde42a94cac4fe628cfac7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@
sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa229d0805576aa04db8e58e52e8c8bdc07663414dde42a94cac4fe628cfac7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-29c8j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:40Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:40 crc kubenswrapper[4677]: I1007 13:07:40.805964 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pjgpx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"73bebfb3-50b5-48b6-b348-1d1feb6202d2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b0ac92c71edc3d5107aece2d0e005a546cf25d79d696f4e330b7c0c8babc546\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mou
ntPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h59cg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pjgpx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:40Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:40 crc kubenswrapper[4677]: I1007 13:07:40.820479 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c40c47d2-50a4-43f1-9b6e-08b60a3260c5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f36e52a7e88b59d8fd38c1fe659ce9b539e514c9d31e326a3ed647ebb8d19781\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://84b1c015461fecca9e5122abe950f33e24f4b7188568ea84cb059a08a4637963\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ecf81a2a9f147c0d9643f8e6c45248164053203ca4e5bbdc57c38e5803a5386\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e8dc3f8bdc52104efdb49a017d6497e2aaa3ed2b593794413fcd1acf2e06d36\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:40Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:40 crc kubenswrapper[4677]: I1007 13:07:40.889058 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:40 crc kubenswrapper[4677]: I1007 13:07:40.889124 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:40 crc kubenswrapper[4677]: I1007 13:07:40.889143 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:40 crc kubenswrapper[4677]: I1007 13:07:40.889168 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:40 crc kubenswrapper[4677]: I1007 13:07:40.889187 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:40Z","lastTransitionTime":"2025-10-07T13:07:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:07:40 crc kubenswrapper[4677]: I1007 13:07:40.992043 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:40 crc kubenswrapper[4677]: I1007 13:07:40.992082 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:40 crc kubenswrapper[4677]: I1007 13:07:40.992092 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:40 crc kubenswrapper[4677]: I1007 13:07:40.992106 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:40 crc kubenswrapper[4677]: I1007 13:07:40.992116 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:40Z","lastTransitionTime":"2025-10-07T13:07:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:07:41 crc kubenswrapper[4677]: I1007 13:07:41.093906 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:41 crc kubenswrapper[4677]: I1007 13:07:41.093951 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:41 crc kubenswrapper[4677]: I1007 13:07:41.093965 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:41 crc kubenswrapper[4677]: I1007 13:07:41.093983 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:41 crc kubenswrapper[4677]: I1007 13:07:41.093996 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:41Z","lastTransitionTime":"2025-10-07T13:07:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:07:41 crc kubenswrapper[4677]: I1007 13:07:41.195962 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:41 crc kubenswrapper[4677]: I1007 13:07:41.195999 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:41 crc kubenswrapper[4677]: I1007 13:07:41.196007 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:41 crc kubenswrapper[4677]: I1007 13:07:41.196022 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:41 crc kubenswrapper[4677]: I1007 13:07:41.196033 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:41Z","lastTransitionTime":"2025-10-07T13:07:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:07:41 crc kubenswrapper[4677]: I1007 13:07:41.298912 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:41 crc kubenswrapper[4677]: I1007 13:07:41.298990 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:41 crc kubenswrapper[4677]: I1007 13:07:41.299014 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:41 crc kubenswrapper[4677]: I1007 13:07:41.299044 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:41 crc kubenswrapper[4677]: I1007 13:07:41.299069 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:41Z","lastTransitionTime":"2025-10-07T13:07:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:07:41 crc kubenswrapper[4677]: I1007 13:07:41.302061 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:07:41 crc kubenswrapper[4677]: I1007 13:07:41.302069 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 13:07:41 crc kubenswrapper[4677]: E1007 13:07:41.302207 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 13:07:41 crc kubenswrapper[4677]: E1007 13:07:41.302306 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 13:07:41 crc kubenswrapper[4677]: I1007 13:07:41.401967 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:41 crc kubenswrapper[4677]: I1007 13:07:41.402007 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:41 crc kubenswrapper[4677]: I1007 13:07:41.402023 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:41 crc kubenswrapper[4677]: I1007 13:07:41.402040 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:41 crc kubenswrapper[4677]: I1007 13:07:41.402052 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:41Z","lastTransitionTime":"2025-10-07T13:07:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:07:41 crc kubenswrapper[4677]: I1007 13:07:41.504694 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:41 crc kubenswrapper[4677]: I1007 13:07:41.504758 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:41 crc kubenswrapper[4677]: I1007 13:07:41.504779 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:41 crc kubenswrapper[4677]: I1007 13:07:41.504806 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:41 crc kubenswrapper[4677]: I1007 13:07:41.504826 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:41Z","lastTransitionTime":"2025-10-07T13:07:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:07:41 crc kubenswrapper[4677]: I1007 13:07:41.580292 4677 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-29c8j_3458826a-000d-407d-92c8-236d1a05842e/ovnkube-controller/0.log" Oct 07 13:07:41 crc kubenswrapper[4677]: I1007 13:07:41.584881 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29c8j" event={"ID":"3458826a-000d-407d-92c8-236d1a05842e","Type":"ContainerStarted","Data":"8fba8b3547cb6e55cb13bf4889eabf3bf5e731d44475c488c659fe844b985074"} Oct 07 13:07:41 crc kubenswrapper[4677]: I1007 13:07:41.585114 4677 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 07 13:07:41 crc kubenswrapper[4677]: I1007 13:07:41.601792 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:41Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:41 crc kubenswrapper[4677]: I1007 13:07:41.607149 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:41 crc kubenswrapper[4677]: I1007 13:07:41.607201 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:41 crc kubenswrapper[4677]: I1007 13:07:41.607213 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:41 crc kubenswrapper[4677]: I1007 13:07:41.607231 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:41 crc kubenswrapper[4677]: I1007 13:07:41.607244 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:41Z","lastTransitionTime":"2025-10-07T13:07:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:07:41 crc kubenswrapper[4677]: I1007 13:07:41.621030 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-r7cnz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7879fa59-a7cb-4d29-ba3a-c91f43bfcba6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5b2cfaaf4533573a7cdf927cb9a0b61690f4f04ca22f5da5013fd218ee2cba1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r59hd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d3e4ef8267212ad1faf24bfcb3b6f633a283684ba587e304e94d434bd9a2618\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r59hd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-r7cnz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:41Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:41 crc kubenswrapper[4677]: I1007 13:07:41.657823 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29c8j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3458826a-000d-407d-92c8-236d1a05842e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cf7d8cdd34bc883eae38c5e4690efd4e1c29cc633b5bbadc5de2b5b844a9da3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b77b2aafb3baf1c5b72d62156bd1c1bec76385637d5795166fe3d4f22a169503\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kuber
netes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99b5fbb5ad3249aa5264c37bd635ed5f6283ec72c7eb071002cd7bddc12052f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eee7c253a1a514447553be977a3e534608ef6a1178664bf139ee84ec41180db0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f333db7aeb7d3cd308131992b4cd1284c1c56e27bbfd731404febc0efc953925\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ddf4e352b778815786f6fb204486a53d958310e53569f89a2895fe388a727da\\\",\\\"image
\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fba8b3547cb6e55cb13bf4889eabf3bf5e731d44475c488c659fe844b985074\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7dba5075e656b746f559affe6da7c0989fbbf09769b561bbdd1ddb8e552134a4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T13:07:39Z\\\",\\\"message\\\":\\\"orkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1007 13:07:39.213014 5962 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1007 13:07:39.213230 5962 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/factory.go:117\\\\nI1007 13:07:39.213573 5962 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1007 13:07:39.213591 5962 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1007 13:07:39.213763 5962 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1007 13:07:39.214112 5962 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1007 13:07:39.214258 5962 reflector.go:311] Stopping reflector *v1.Namespace (0s) from 
k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:36Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1f410624ff7e026c196d43d5ef830ce7b34981b703d5399a135dab0122640ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initConta
inerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa229d0805576aa04db8e58e52e8c8bdc07663414dde42a94cac4fe628cfac7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa229d0805576aa04db8e58e52e8c8bdc07663414dde42a94cac4fe628cfac7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-29c8j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:41Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:41 crc kubenswrapper[4677]: I1007 13:07:41.678268 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pjgpx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"73bebfb3-50b5-48b6-b348-1d1feb6202d2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b0ac92c71edc3d5107aece2d0e005a546cf25d79d696f4e330b7c0c8babc546\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h59cg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pjgpx\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:41Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:41 crc kubenswrapper[4677]: I1007 13:07:41.691282 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c40c47d2-50a4-43f1-9b6e-08b60a3260c5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f36e52a7e88b59d8fd38c1fe659ce9b539e514c9d31e326a3ed647ebb8d19781\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://84b1c015461fecca9e5122abe950f33e24f4b7188568ea84cb059a08a4637963\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ecf81a2a9f147c0d9643f8e6c45248164053203ca4e5bbdc57c38e5803a5386\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"st
arted\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e8dc3f8bdc52104efdb49a017d6497e2aaa3ed2b593794413fcd1acf2e06d36\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:41Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:41 crc kubenswrapper[4677]: I1007 13:07:41.705570 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf9347ca53bc58ad2e19bdbccd5eb40fde5ef36cdc0c2a2899e7e86977208446\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7932ee6d24ab75f34dabb17b5c2732dc1437e94b4fab6cace5c5bf4d8b4a8fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:41Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:41 crc kubenswrapper[4677]: I1007 13:07:41.709259 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:41 crc kubenswrapper[4677]: I1007 13:07:41.709298 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:41 crc kubenswrapper[4677]: I1007 13:07:41.709309 4677 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Oct 07 13:07:41 crc kubenswrapper[4677]: I1007 13:07:41.709325 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:41 crc kubenswrapper[4677]: I1007 13:07:41.709336 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:41Z","lastTransitionTime":"2025-10-07T13:07:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:07:41 crc kubenswrapper[4677]: I1007 13:07:41.717282 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c2h2k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6a7b491-6ed9-4906-8d2d-d8913a581b95\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://079f44a0676fd6e659268707658dfce76f5c80881ebd1b7f77b831a653002cbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gh4gv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c2h2k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:41Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:41 crc kubenswrapper[4677]: I1007 13:07:41.737357 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-czmsr" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67f7f734-b59a-447c-b4a5-5aeb78d3a4dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://620c115cda692d86e1c655fe633ade8d56b4ad3faff70ec3383e0d6931e91acd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84e2aad54ae491e74845fbb681491bde96694b5f242d15a1b4c0f62b81048e7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84e2aad54ae491e74845fbb681491bde96694b5f242d15a1b4c0f62b81048e7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2dfd4b5d1804d9d8e9ccf4dea8fb23cf60cd039e32d4bfc7d0b0e189da178b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:68
7fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2dfd4b5d1804d9d8e9ccf4dea8fb23cf60cd039e32d4bfc7d0b0e189da178b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c337ed2525a5d0aca41ff193138829ecd98172765110f97081d5a9b57ea39519\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c337ed2525a5d0aca41ff193138829ecd98172765110f97081d5a9b57ea39519\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b53d862bdcbdf9ebb58ca6717073ecf18236be981bf633bf22e3a077fa6ed90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b53d862bdcbdf9ebb58ca6717073ecf18236be981bf633bf22e3a077fa6ed90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"m
ountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3de85e8780fadb509355343e297ea73c2bd1047219be9911d445e044d40d378b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3de85e8780fadb509355343e297ea73c2bd1047219be9911d445e044d40d378b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6309cf358d3d85018e9e89fdb974146ab00944de09d3b38cfbf357df59b442f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6309cf358d3d85018e9e89fdb974146ab00944de09d3b38cfbf357df59b442f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-czmsr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:41Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:41 crc kubenswrapper[4677]: I1007 13:07:41.753845 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ad65790-6a90-4c21-b5c5-ac1ddf2cbe52\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34b15b8f71a10920a74f784c3440031e14726f661e10f628b269da08e70a7cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a51c6d4e82136b444754dc679f864558f74624af4ff94f794e473c92c8f6c87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96d7ddc445a61d5fd4959d1b3b4e2c93503111a12f461d945dd298a3f8540f65\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f3139524d94a487c756b4a7e3660c0a4380bcba8aa8b588368530f43aab0b1d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f3139524d94a487c756b4a7e3660c0a4380bcba8aa8b588368530f43aab0b1d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T13:07:25Z\\\",\\\"message\\\":\\\"le observer\\\\nW1007 13:07:25.065953 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1007 13:07:25.066115 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 13:07:25.067558 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2107846005/tls.crt::/tmp/serving-cert-2107846005/tls.key\\\\\\\"\\\\nI1007 13:07:25.378394 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1007 13:07:25.383525 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1007 13:07:25.383565 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1007 13:07:25.383605 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1007 13:07:25.383617 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1007 13:07:25.390977 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1007 13:07:25.391014 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1007 13:07:25.391020 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 13:07:25.391038 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 13:07:25.391053 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1007 13:07:25.391061 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1007 13:07:25.391071 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1007 13:07:25.391088 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1007 13:07:25.394664 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b92962cb41f37a615f473651c01e37f5d53e01f3fb4b7c0eb2092095bb55239\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d927fbc5242eb951e8c2ec6d2859315f34e4b190bb43e646044111d1f583bf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d927fbc5242eb951e8c2ec6d2859315f34e4b190bb43e646044111d1f583bf0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:41Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:41 crc kubenswrapper[4677]: I1007 13:07:41.768633 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ef29067dc23263a94c4f861ada9ebbe04aae442de3da9fa34db521177f60ce8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:41Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:41 crc kubenswrapper[4677]: I1007 13:07:41.780717 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:41Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:41 crc kubenswrapper[4677]: I1007 13:07:41.791939 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8xd94" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f78c6e9a-e5e3-4296-b2e6-3ba36d1808ff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2e7ebbc9f01ac7f853075c65c8cc57c691cf3f95e41036294486ad4a3bb807c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8xd94\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:41Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:41 crc kubenswrapper[4677]: I1007 13:07:41.812180 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:41 crc kubenswrapper[4677]: I1007 13:07:41.812220 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:41 crc kubenswrapper[4677]: I1007 13:07:41.812231 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:41 crc kubenswrapper[4677]: I1007 13:07:41.812248 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:41 crc kubenswrapper[4677]: I1007 13:07:41.812260 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:41Z","lastTransitionTime":"2025-10-07T13:07:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:07:41 crc kubenswrapper[4677]: I1007 13:07:41.820849 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9c35782-52f8-4fbc-9e52-07ee92002e3d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9182677b05c8d32f333b4e806b6dc29e0ce3f6171616ed303459ccb6a3754a4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://550a4491cebcbd8b3a62831cce07b13bb79051c
d51505aab1f74bcfee692f7b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d90d7cc786a9f94c269a99be97c00685a2e10bde12e0afe4db2de40b95749a47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e99541a9f53339e760fb1074be18ebfcb8b225c64c290478559d2e3722ba9296\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73cec6b690e4d01d9d206a812f278832d622a7bdfb74ddcfb5904e19f721fae6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f16639187300b01b7c9f30dda26da7c1de9de9c66baf7ad716875eba41a5d7d6\\\",\\\"image\\\":\\\"quay.io/openshi
ft-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f16639187300b01b7c9f30dda26da7c1de9de9c66baf7ad716875eba41a5d7d6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dab4e59081dcacddedf345841dce8c940fae52aeee55d6c956e1364dd70872b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dab4e59081dcacddedf345841dce8c940fae52aeee55d6c956e1364dd70872b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://abcfe5aa83c73ae24b7f0b414b47344969924ca50a5e1669bfb4704f1386cd17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abcfe5aa83c73ae24b7f0b414b47344969924ca50a5e1669bfb4704f1386cd17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:41Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:41 crc kubenswrapper[4677]: I1007 13:07:41.835878 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e49f4b2f98de9e297e6a31a5583120192adf9a013700b49bb419e54d9e75fdbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:41Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:41 crc kubenswrapper[4677]: I1007 13:07:41.852741 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:41Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:41 crc kubenswrapper[4677]: I1007 13:07:41.915879 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:41 crc kubenswrapper[4677]: I1007 13:07:41.915959 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:41 crc kubenswrapper[4677]: I1007 13:07:41.915985 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:41 crc kubenswrapper[4677]: I1007 13:07:41.916015 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:41 crc kubenswrapper[4677]: I1007 13:07:41.916036 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:41Z","lastTransitionTime":"2025-10-07T13:07:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:07:42 crc kubenswrapper[4677]: I1007 13:07:42.021869 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:42 crc kubenswrapper[4677]: I1007 13:07:42.022183 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:42 crc kubenswrapper[4677]: I1007 13:07:42.022216 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:42 crc kubenswrapper[4677]: I1007 13:07:42.022245 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:42 crc kubenswrapper[4677]: I1007 13:07:42.022259 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:42Z","lastTransitionTime":"2025-10-07T13:07:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:07:42 crc kubenswrapper[4677]: I1007 13:07:42.023000 4677 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qf2v9"] Oct 07 13:07:42 crc kubenswrapper[4677]: I1007 13:07:42.025282 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qf2v9" Oct 07 13:07:42 crc kubenswrapper[4677]: I1007 13:07:42.027924 4677 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Oct 07 13:07:42 crc kubenswrapper[4677]: I1007 13:07:42.028919 4677 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Oct 07 13:07:42 crc kubenswrapper[4677]: I1007 13:07:42.047436 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf9347ca53bc58ad2e19bdbccd5eb40fde5ef36cdc0c2a2899e7e86977208446\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7932ee6d24ab75f34dabb17b5c2732dc1437e94b4fab6cace5c5bf4d8b4a8fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\"
:\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:42Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:42 crc kubenswrapper[4677]: I1007 13:07:42.059975 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c2h2k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6a7b491-6ed9-4906-8d2d-d8913a581b95\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://079f44a0676fd6e659268707658dfce76f5c80881ebd1b7f77b831a653002cbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gh4gv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c2h2k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:42Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:42 crc kubenswrapper[4677]: I1007 13:07:42.076033 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-czmsr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"67f7f734-b59a-447c-b4a5-5aeb78d3a4dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://620c115cda692d86e1c655fe633ade8d56b4ad3faff70ec3383e0d6931e91acd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84e2aad54ae491e74845fbb681491bde96694b5f242d15a1b4c0f62b81048e7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84e2aad54ae491e74845fbb681491bde96694b5f242d15a1b4c0f62b81048e7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2dfd4b5d1804d9d8e9ccf4dea8fb23cf60cd039e32d4bfc7d0b0e189da178b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2dfd4b5d1804d9d8e9ccf4dea8fb23cf60cd039e32d4bfc7d0b0e189da178b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c337ed2525a5d0aca41ff193138829ecd98172765110f97081d5a9b57ea39519\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c337ed2525a5d0aca41ff193138829ecd98172765110f97081d5a9b57ea39519\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b53d862bdcbdf9ebb58ca6717073ecf18236be981bf633bf22e3a077fa6ed90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b53d862bdcbdf9ebb58ca6717073ecf18236be981bf633bf22e3a077fa6ed90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3de85e8780fadb509355343e297ea73c2bd1047219be9911d445e044d40d378b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3de85e8780fadb509355343e297ea73c2bd1047219be9911d445e044d40d378b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6309cf358d3d85018e9e89fdb974146ab00944de09d3b38cfbf357df59b442f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6309cf358d3d85018e9e89fdb974146ab00944de09d3b38cfbf357df59b442f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-czmsr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:42Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:42 crc kubenswrapper[4677]: I1007 13:07:42.100041 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9c35782-52f8-4fbc-9e52-07ee92002e3d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9182677b05c8d32f333b4e806b6dc29e0ce3f6171616ed303459ccb6a3754a4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://550a4491cebcbd8b3a62831cce07b13bb79051cd51505aab1f74bcfee692f7b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d90d7cc786a9f94c269a99be97c00685a2e10bde12e0afe4db2de40b95749a47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e99541a9f53339e760fb1074be18ebfcb8b225c
64c290478559d2e3722ba9296\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73cec6b690e4d01d9d206a812f278832d622a7bdfb74ddcfb5904e19f721fae6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f16639187300b01b7c9f30dda26da7c1de9de9c66baf7ad716875eba41a5d7d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f16639187300b01b7c9f30dda26da7c1de9de9c66baf7ad716875eba41a5d7d6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dab4e59081dcacddedf345841dce8c940fae52aeee55d6c956e1364dd70872b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dab4e59081dcacddedf345841dce8c940fae52aeee55d6c956e1364dd70872b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://abcfe5aa83c73ae24b7f0b414b47344969924ca50a5e1669bfb4704f1386cd17\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abcfe5aa83c73ae24b7f0b414b47344969924ca50a5e1669bfb4704f1386cd17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:42Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:42 crc kubenswrapper[4677]: I1007 13:07:42.117922 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ad65790-6a90-4c21-b5c5-ac1ddf2cbe52\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34b15b8f71a10920a74f784c3440031e14726f661e10f628b269da08e70a7cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a51c6d4e82136b444754dc679f864558f74624af4ff94f794e473c92c8f6c87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96d7ddc445a61d5fd4959d1b3b4e2c93503111a12f461d945dd298a3f8540f65\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f3139524d94a487c756b4a7e3660c0a4380bcba8aa8b588368530f43aab0b1d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f3139524d94a487c756b4a7e3660c0a4380bcba8aa8b588368530f43aab0b1d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T13:07:25Z\\\",\\\"message\\\":\\\"le observer\\\\nW1007 13:07:25.065953 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1007 13:07:25.066115 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 13:07:25.067558 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2107846005/tls.crt::/tmp/serving-cert-2107846005/tls.key\\\\\\\"\\\\nI1007 13:07:25.378394 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1007 13:07:25.383525 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1007 13:07:25.383565 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1007 13:07:25.383605 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1007 13:07:25.383617 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1007 13:07:25.390977 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1007 13:07:25.391014 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1007 13:07:25.391020 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 13:07:25.391038 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 13:07:25.391053 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1007 13:07:25.391061 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1007 13:07:25.391071 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1007 13:07:25.391088 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1007 13:07:25.394664 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b92962cb41f37a615f473651c01e37f5d53e01f3fb4b7c0eb2092095bb55239\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d927fbc5242eb951e8c2ec6d2859315f34e4b190bb43e646044111d1f583bf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d927fbc5242eb951e8c2ec6d2859315f34e4b190bb43e646044111d1f583bf0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:42Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:42 crc kubenswrapper[4677]: I1007 13:07:42.118750 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ftmxj\" (UniqueName: \"kubernetes.io/projected/ea5a5436-29f6-4edd-9d4d-22eb9dd828c3-kube-api-access-ftmxj\") pod \"ovnkube-control-plane-749d76644c-qf2v9\" (UID: \"ea5a5436-29f6-4edd-9d4d-22eb9dd828c3\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qf2v9" Oct 07 13:07:42 crc kubenswrapper[4677]: I1007 13:07:42.118968 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ea5a5436-29f6-4edd-9d4d-22eb9dd828c3-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-qf2v9\" (UID: \"ea5a5436-29f6-4edd-9d4d-22eb9dd828c3\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qf2v9" Oct 07 13:07:42 crc kubenswrapper[4677]: I1007 13:07:42.119157 4677 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ea5a5436-29f6-4edd-9d4d-22eb9dd828c3-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-qf2v9\" (UID: \"ea5a5436-29f6-4edd-9d4d-22eb9dd828c3\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qf2v9" Oct 07 13:07:42 crc kubenswrapper[4677]: I1007 13:07:42.119391 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ea5a5436-29f6-4edd-9d4d-22eb9dd828c3-env-overrides\") pod \"ovnkube-control-plane-749d76644c-qf2v9\" (UID: \"ea5a5436-29f6-4edd-9d4d-22eb9dd828c3\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qf2v9" Oct 07 13:07:42 crc kubenswrapper[4677]: I1007 13:07:42.125348 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:42 crc kubenswrapper[4677]: I1007 13:07:42.125404 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:42 crc kubenswrapper[4677]: I1007 13:07:42.125424 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:42 crc kubenswrapper[4677]: I1007 13:07:42.125486 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:42 crc kubenswrapper[4677]: I1007 13:07:42.125504 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:42Z","lastTransitionTime":"2025-10-07T13:07:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:07:42 crc kubenswrapper[4677]: I1007 13:07:42.138011 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ef29067dc23263a94c4f861ada9ebbe04aae442de3da9fa34db521177f60ce8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:42Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:42 crc kubenswrapper[4677]: I1007 13:07:42.155262 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:42Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:42 crc kubenswrapper[4677]: I1007 13:07:42.167321 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8xd94" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f78c6e9a-e5e3-4296-b2e6-3ba36d1808ff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2e7ebbc9f01ac7f853075c65c8cc57c691cf3f95e41036294486ad4a3bb807c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\
\\"name\\\":\\\"kube-api-access-k9nz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8xd94\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:42Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:42 crc kubenswrapper[4677]: I1007 13:07:42.179557 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:42Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:42 crc kubenswrapper[4677]: I1007 13:07:42.192844 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e49f4b2f98de9e297e6a31a5583120192adf9a013700b49bb419e54d9e75fdbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:42Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:42 crc kubenswrapper[4677]: I1007 13:07:42.209675 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c40c47d2-50a4-43f1-9b6e-08b60a3260c5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f36e52a7e88b59d8fd38c1fe659ce9b539e514c9d31e326a3ed647ebb8d19781\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://84b1c015461fecca9e5122abe950f33e24f4b7188568ea84cb059a08a4637963\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ecf81a2a9f147c0d9643f8e6c45248164053203ca4e5bbdc57c38e5803a5386\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e8dc3f8bdc52104efdb49a017d6497e2aaa3ed2b593794413fcd1acf2e06d36\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:42Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:42 crc kubenswrapper[4677]: I1007 13:07:42.220649 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:42Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:42 crc kubenswrapper[4677]: I1007 13:07:42.221068 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ea5a5436-29f6-4edd-9d4d-22eb9dd828c3-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-qf2v9\" (UID: \"ea5a5436-29f6-4edd-9d4d-22eb9dd828c3\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qf2v9" Oct 07 13:07:42 crc kubenswrapper[4677]: I1007 13:07:42.221177 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ea5a5436-29f6-4edd-9d4d-22eb9dd828c3-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-qf2v9\" (UID: \"ea5a5436-29f6-4edd-9d4d-22eb9dd828c3\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qf2v9" Oct 07 13:07:42 crc kubenswrapper[4677]: I1007 13:07:42.221247 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ea5a5436-29f6-4edd-9d4d-22eb9dd828c3-env-overrides\") pod \"ovnkube-control-plane-749d76644c-qf2v9\" (UID: \"ea5a5436-29f6-4edd-9d4d-22eb9dd828c3\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qf2v9" Oct 07 13:07:42 crc kubenswrapper[4677]: I1007 13:07:42.221311 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ftmxj\" (UniqueName: \"kubernetes.io/projected/ea5a5436-29f6-4edd-9d4d-22eb9dd828c3-kube-api-access-ftmxj\") pod \"ovnkube-control-plane-749d76644c-qf2v9\" (UID: \"ea5a5436-29f6-4edd-9d4d-22eb9dd828c3\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qf2v9" Oct 07 13:07:42 crc kubenswrapper[4677]: I1007 13:07:42.222347 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ea5a5436-29f6-4edd-9d4d-22eb9dd828c3-env-overrides\") pod \"ovnkube-control-plane-749d76644c-qf2v9\" (UID: \"ea5a5436-29f6-4edd-9d4d-22eb9dd828c3\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qf2v9" Oct 07 13:07:42 crc kubenswrapper[4677]: I1007 13:07:42.222561 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ea5a5436-29f6-4edd-9d4d-22eb9dd828c3-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-qf2v9\" (UID: \"ea5a5436-29f6-4edd-9d4d-22eb9dd828c3\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qf2v9" Oct 07 13:07:42 crc kubenswrapper[4677]: I1007 
13:07:42.228090 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:42 crc kubenswrapper[4677]: I1007 13:07:42.228145 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:42 crc kubenswrapper[4677]: I1007 13:07:42.228163 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:42 crc kubenswrapper[4677]: I1007 13:07:42.228187 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:42 crc kubenswrapper[4677]: I1007 13:07:42.228204 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:42Z","lastTransitionTime":"2025-10-07T13:07:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:07:42 crc kubenswrapper[4677]: I1007 13:07:42.229117 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ea5a5436-29f6-4edd-9d4d-22eb9dd828c3-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-qf2v9\" (UID: \"ea5a5436-29f6-4edd-9d4d-22eb9dd828c3\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qf2v9" Oct 07 13:07:42 crc kubenswrapper[4677]: I1007 13:07:42.237395 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-r7cnz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7879fa59-a7cb-4d29-ba3a-c91f43bfcba6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5b2cfaaf4533573a7cdf927cb9a0b61690f4f04ca22f5da5013fd218ee2cba1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r59hd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d3e4ef8267212ad1faf24bfcb3b6f633a283684ba587e304e94d434bd9a2618\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r59hd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-r7cnz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:42Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:42 crc kubenswrapper[4677]: I1007 13:07:42.256544 4677 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-ftmxj\" (UniqueName: \"kubernetes.io/projected/ea5a5436-29f6-4edd-9d4d-22eb9dd828c3-kube-api-access-ftmxj\") pod \"ovnkube-control-plane-749d76644c-qf2v9\" (UID: \"ea5a5436-29f6-4edd-9d4d-22eb9dd828c3\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qf2v9" Oct 07 13:07:42 crc kubenswrapper[4677]: I1007 13:07:42.267942 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29c8j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3458826a-000d-407d-92c8-236d1a05842e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cf7d8cdd34bc883eae38c5e4690efd4e1c29cc633b5bbadc5de2b5b844a9da3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b77b2aafb3baf1c5b72d62156bd1c1bec76385637d5795166fe3d4f22a169503\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\
\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99b5fbb5ad3249aa5264c37bd635ed5f6283ec72c7eb071002cd7bddc12052f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eee7c253a1a514447553be977a3e534608ef6a1178664bf139ee84ec41180db0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f333db7aeb7d3cd308131992b4cd1284c1c56e27bbfd731404febc0efc953925\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadO
nly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ddf4e352b778815786f6fb204486a53d958310e53569f89a2895fe388a727da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fba8b3547cb6e55cb13bf4889eabf3bf5e731d44475c488c659fe844b985074\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7dba5075e656b746f559affe6da7c0989fbbf09769b561bbdd1ddb8e552134a4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T13:07:39Z\\\",\\\"message\\\":\\\"orkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1007 13:07:39.213014 5962 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1007 13:07:39.213230 5962 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/factory.go:117\\\\nI1007 13:07:39.213573 5962 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1007 13:07:39.213591 5962 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1007 13:07:39.213763 5962 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1007 13:07:39.214112 5962 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1007 13:07:39.214258 5962 reflector.go:311] Stopping reflector *v1.Namespace (0s) from 
k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:36Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1f410624ff7e026c196d43d5ef830ce7b34981b703d5399a135dab0122640ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initConta
inerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa229d0805576aa04db8e58e52e8c8bdc07663414dde42a94cac4fe628cfac7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa229d0805576aa04db8e58e52e8c8bdc07663414dde42a94cac4fe628cfac7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-29c8j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:42Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:42 crc kubenswrapper[4677]: I1007 13:07:42.291578 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pjgpx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"73bebfb3-50b5-48b6-b348-1d1feb6202d2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b0ac92c71edc3d5107aece2d0e005a546cf25d79d696f4e330b7c0c8babc546\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h59cg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pjgpx\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:42Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:42 crc kubenswrapper[4677]: I1007 13:07:42.302001 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 13:07:42 crc kubenswrapper[4677]: E1007 13:07:42.302166 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 13:07:42 crc kubenswrapper[4677]: I1007 13:07:42.303212 4677 scope.go:117] "RemoveContainer" containerID="8f3139524d94a487c756b4a7e3660c0a4380bcba8aa8b588368530f43aab0b1d" Oct 07 13:07:42 crc kubenswrapper[4677]: I1007 13:07:42.309477 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qf2v9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea5a5436-29f6-4edd-9d4d-22eb9dd828c3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ftmxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ftmxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qf2v9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:42Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:42 crc kubenswrapper[4677]: I1007 13:07:42.330704 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:42 crc kubenswrapper[4677]: I1007 13:07:42.330766 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:42 crc kubenswrapper[4677]: I1007 13:07:42.330785 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:42 crc kubenswrapper[4677]: I1007 13:07:42.330810 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:42 crc kubenswrapper[4677]: I1007 13:07:42.330831 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:42Z","lastTransitionTime":"2025-10-07T13:07:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:07:42 crc kubenswrapper[4677]: I1007 13:07:42.345110 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qf2v9" Oct 07 13:07:42 crc kubenswrapper[4677]: I1007 13:07:42.434266 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:42 crc kubenswrapper[4677]: I1007 13:07:42.434313 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:42 crc kubenswrapper[4677]: I1007 13:07:42.434329 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:42 crc kubenswrapper[4677]: I1007 13:07:42.434348 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:42 crc kubenswrapper[4677]: I1007 13:07:42.434360 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:42Z","lastTransitionTime":"2025-10-07T13:07:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:07:42 crc kubenswrapper[4677]: I1007 13:07:42.537137 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:42 crc kubenswrapper[4677]: I1007 13:07:42.537192 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:42 crc kubenswrapper[4677]: I1007 13:07:42.537210 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:42 crc kubenswrapper[4677]: I1007 13:07:42.537233 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:42 crc kubenswrapper[4677]: I1007 13:07:42.537251 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:42Z","lastTransitionTime":"2025-10-07T13:07:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:07:42 crc kubenswrapper[4677]: I1007 13:07:42.589178 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qf2v9" event={"ID":"ea5a5436-29f6-4edd-9d4d-22eb9dd828c3","Type":"ContainerStarted","Data":"993f5f541494dfe864070ad7ed17d36b07d341449d0e1ac6f44d26f74faf0c0b"} Oct 07 13:07:42 crc kubenswrapper[4677]: I1007 13:07:42.591787 4677 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-29c8j_3458826a-000d-407d-92c8-236d1a05842e/ovnkube-controller/1.log" Oct 07 13:07:42 crc kubenswrapper[4677]: I1007 13:07:42.592743 4677 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-29c8j_3458826a-000d-407d-92c8-236d1a05842e/ovnkube-controller/0.log" Oct 07 13:07:42 crc kubenswrapper[4677]: I1007 13:07:42.595985 4677 generic.go:334] "Generic (PLEG): container finished" podID="3458826a-000d-407d-92c8-236d1a05842e" containerID="8fba8b3547cb6e55cb13bf4889eabf3bf5e731d44475c488c659fe844b985074" exitCode=1 Oct 07 13:07:42 crc kubenswrapper[4677]: I1007 13:07:42.596031 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29c8j" event={"ID":"3458826a-000d-407d-92c8-236d1a05842e","Type":"ContainerDied","Data":"8fba8b3547cb6e55cb13bf4889eabf3bf5e731d44475c488c659fe844b985074"} Oct 07 13:07:42 crc kubenswrapper[4677]: I1007 13:07:42.596065 4677 scope.go:117] "RemoveContainer" containerID="7dba5075e656b746f559affe6da7c0989fbbf09769b561bbdd1ddb8e552134a4" Oct 07 13:07:42 crc kubenswrapper[4677]: I1007 13:07:42.597130 4677 scope.go:117] "RemoveContainer" containerID="8fba8b3547cb6e55cb13bf4889eabf3bf5e731d44475c488c659fe844b985074" Oct 07 13:07:42 crc kubenswrapper[4677]: E1007 13:07:42.597513 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-29c8j_openshift-ovn-kubernetes(3458826a-000d-407d-92c8-236d1a05842e)\"" pod="openshift-ovn-kubernetes/ovnkube-node-29c8j" podUID="3458826a-000d-407d-92c8-236d1a05842e" Oct 07 13:07:42 crc kubenswrapper[4677]: I1007 13:07:42.611749 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c2h2k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6a7b491-6ed9-4906-8d2d-d8913a581b95\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://079f44a0676fd6e659268707658dfce76f5c80881ebd1b7f77b831a653002cbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gh4gv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c2h2k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:42Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:42 crc kubenswrapper[4677]: I1007 13:07:42.637666 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-czmsr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"67f7f734-b59a-447c-b4a5-5aeb78d3a4dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://620c115cda692d86e1c655fe633ade8d56b4ad3faff70ec3383e0d6931e91acd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84e2aad54ae491e74845fbb681491bde96694b5f242d15a1b4c0f62b81048e7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84e2aad54ae491e74845fbb681491bde96694b5f242d15a1b4c0f62b81048e7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2dfd4b5d1804d9d8e9ccf4dea8fb23cf60cd039e32d4bfc7d0b0e189da178b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2dfd4b5d1804d9d8e9ccf4dea8fb23cf60cd039e32d4bfc7d0b0e189da178b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c337ed2525a5d0aca41ff193138829ecd98172765110f97081d5a9b57ea39519\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c337ed2525a5d0aca41ff193138829ecd98172765110f97081d5a9b57ea39519\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b53d862bdcbdf9ebb58ca6717073ecf18236be981bf633bf22e3a077fa6ed90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b53d862bdcbdf9ebb58ca6717073ecf18236be981bf633bf22e3a077fa6ed90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3de85e8780fadb509355343e297ea73c2bd1047219be9911d445e044d40d378b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3de85e8780fadb509355343e297ea73c2bd1047219be9911d445e044d40d378b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6309cf358d3d85018e9e89fdb974146ab00944de09d3b38cfbf357df59b442f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6309cf358d3d85018e9e89fdb974146ab00944de09d3b38cfbf357df59b442f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-czmsr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:42Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:42 crc kubenswrapper[4677]: I1007 13:07:42.640255 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:42 crc kubenswrapper[4677]: I1007 13:07:42.640289 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:42 crc 
kubenswrapper[4677]: I1007 13:07:42.640298 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:42 crc kubenswrapper[4677]: I1007 13:07:42.640312 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:42 crc kubenswrapper[4677]: I1007 13:07:42.640323 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:42Z","lastTransitionTime":"2025-10-07T13:07:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:07:42 crc kubenswrapper[4677]: I1007 13:07:42.653408 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf9347ca53bc58ad2e19bdbccd5eb40fde5ef36cdc0c2a2899e7e86977208446\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7932ee6d24ab75f34dabb17b5c2732dc1437e94b4fab6cace5c5bf4d8b4a8fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/va
r/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:42Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:42 crc kubenswrapper[4677]: I1007 13:07:42.666970 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ef29067dc23263a94c4f861ada9ebbe04aae442de3da9fa34db521177f60ce8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:42Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:42 crc kubenswrapper[4677]: I1007 13:07:42.679270 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:42Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:42 crc kubenswrapper[4677]: I1007 13:07:42.689726 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8xd94" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f78c6e9a-e5e3-4296-b2e6-3ba36d1808ff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2e7ebbc9f01ac7f853075c65c8cc57c691cf3f95e41036294486ad4a3bb807c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8xd94\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:42Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:42 crc kubenswrapper[4677]: I1007 13:07:42.708864 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9c35782-52f8-4fbc-9e52-07ee92002e3d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9182677b05c8d32f333b4e806b6dc29e0ce3f6171616ed303459ccb6a3754a4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://550a4491cebcbd8b3a62831cce07b13bb79051cd51505aab1f74bcfee692f7b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d90d7cc786a9f94c269a99be97c00685a2e10bde12e0afe4db2de40b95749a47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e99541a9f53339e760fb1074be18ebfcb8b225c
64c290478559d2e3722ba9296\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73cec6b690e4d01d9d206a812f278832d622a7bdfb74ddcfb5904e19f721fae6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f16639187300b01b7c9f30dda26da7c1de9de9c66baf7ad716875eba41a5d7d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f16639187300b01b7c9f30dda26da7c1de9de9c66baf7ad716875eba41a5d7d6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dab4e59081dcacddedf345841dce8c940fae52aeee55d6c956e1364dd70872b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dab4e59081dcacddedf345841dce8c940fae52aeee55d6c956e1364dd70872b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://abcfe5aa83c73ae24b7f0b414b47344969924ca50a5e1669bfb4704f1386cd17\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abcfe5aa83c73ae24b7f0b414b47344969924ca50a5e1669bfb4704f1386cd17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:42Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:42 crc kubenswrapper[4677]: I1007 13:07:42.725522 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ad65790-6a90-4c21-b5c5-ac1ddf2cbe52\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34b15b8f71a10920a74f784c3440031e14726f661e10f628b269da08e70a7cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a51c6d4e82136b444754dc679f864558f74624af4ff94f794e473c92c8f6c87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96d7ddc445a61d5fd4959d1b3b4e2c93503111a12f461d945dd298a3f8540f65\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f3139524d94a487c756b4a7e3660c0a4380bcba8aa8b588368530f43aab0b1d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f3139524d94a487c756b4a7e3660c0a4380bcba8aa8b588368530f43aab0b1d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T13:07:25Z\\\",\\\"message\\\":\\\"le observer\\\\nW1007 13:07:25.065953 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1007 13:07:25.066115 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 13:07:25.067558 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2107846005/tls.crt::/tmp/serving-cert-2107846005/tls.key\\\\\\\"\\\\nI1007 13:07:25.378394 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1007 13:07:25.383525 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1007 13:07:25.383565 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1007 13:07:25.383605 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1007 13:07:25.383617 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1007 13:07:25.390977 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1007 13:07:25.391014 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1007 13:07:25.391020 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 13:07:25.391038 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 13:07:25.391053 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1007 13:07:25.391061 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1007 13:07:25.391071 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1007 13:07:25.391088 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1007 13:07:25.394664 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b92962cb41f37a615f473651c01e37f5d53e01f3fb4b7c0eb2092095bb55239\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d927fbc5242eb951e8c2ec6d2859315f34e4b190bb43e646044111d1f583bf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d927fbc5242eb951e8c2ec6d2859315f34e4b190bb43e646044111d1f583bf0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:42Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:42 crc kubenswrapper[4677]: I1007 13:07:42.743239 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:42 crc kubenswrapper[4677]: I1007 13:07:42.743295 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:42 crc kubenswrapper[4677]: I1007 13:07:42.743312 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:42 crc kubenswrapper[4677]: I1007 13:07:42.743337 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:42 crc kubenswrapper[4677]: I1007 13:07:42.743355 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:42Z","lastTransitionTime":"2025-10-07T13:07:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:07:42 crc kubenswrapper[4677]: I1007 13:07:42.743822 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:42Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:42 crc kubenswrapper[4677]: I1007 13:07:42.762847 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e49f4b2f98de9e297e6a31a5583120192adf9a013700b49bb419e54d9e75fdbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:42Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:42 crc kubenswrapper[4677]: I1007 13:07:42.799206 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-r7cnz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7879fa59-a7cb-4d29-ba3a-c91f43bfcba6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5b2cfaaf4533573a7cdf927cb9a0b61690f4f04ca22f5da5013fd218ee2cba1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r59hd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d3e4ef8267212ad1faf24bfcb3b6f633a283684ba587e304e94d434bd9a2618\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r59hd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-r7cnz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:42Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:42 crc kubenswrapper[4677]: I1007 13:07:42.839696 4677 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29c8j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3458826a-000d-407d-92c8-236d1a05842e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cf7d8cdd34bc883eae38c5e4690efd4e1c29cc633b5bbadc5de2b5b844a9da3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b77b2aafb3baf1c5b72d62156bd1c1bec76385637d5795166fe3d4f22a169503\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99b5fbb5ad3249aa5264c37bd635ed5f6283ec72c7eb071002cd7bddc12052f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eee7c253a1a514447553be977a3e534608ef6a1178664bf139ee84ec41180db0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f333db7aeb7d3cd308131992b4cd1284c1c56e27bbfd731404febc0efc953925\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ddf4e352b778815786f6fb204486a53d958310e53569f89a2895fe388a727da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\
\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fba8b3547cb6e55cb13bf4889eabf3bf5e731d44475c488c659fe844b985074\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7dba5075e656b746f559affe6da7c0989fbbf09769b561bbdd1ddb8e552134a4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T13:07:39Z\\\",\\\"message\\\":\\\"orkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1007 13:07:39.213014 5962 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1007 13:07:39.213230 5962 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/factory.go:117\\\\nI1007 13:07:39.213573 5962 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1007 13:07:39.213591 5962 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1007 13:07:39.213763 5962 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1007 13:07:39.214112 5962 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1007 13:07:39.214258 5962 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:36Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fba8b3547cb6e55cb13bf4889eabf3bf5e731d44475c488c659fe844b985074\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T13:07:41Z\\\",\\\"message\\\":\\\"mage-registry]} name:Service_openshift-image-registry/image-registry_TCP_cluster 
options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.93:5000:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {83c1e277-3d22-42ae-a355-f7a0ff0bd171}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1007 13:07:41.452562 6109 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-ingress-canary/ingress-canary]} name:Service_openshift-ingress-canary/ingress-canary_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.34:8443: 10.217.5.34:8888:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7715118b-bb1b-400a-803e-7ab2cc3eeec0}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1007 13:07:41.452766 6109 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-console/console\\\\\\\"}\\\\nF1007 13:07:41.452679 6109 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\
\\"}]},{\\\"containerID\\\":\\\"cri-o://b1f410624ff7e026c196d43d5ef830ce7b34981b703d5399a135dab0122640ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa229d0805576aa04db8e58e52e8c8bdc07663414dde42a94cac4fe628cfac7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa229d0805576aa04db8e58e52e8c8bdc07663414dde42a94cac4fe628cfac7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-29c8j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:42Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:42 crc kubenswrapper[4677]: I1007 13:07:42.845361 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:42 crc kubenswrapper[4677]: I1007 13:07:42.845410 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:42 crc kubenswrapper[4677]: I1007 13:07:42.845419 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:42 crc kubenswrapper[4677]: I1007 13:07:42.845436 4677 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeNotReady" Oct 07 13:07:42 crc kubenswrapper[4677]: I1007 13:07:42.845457 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:42Z","lastTransitionTime":"2025-10-07T13:07:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:07:42 crc kubenswrapper[4677]: I1007 13:07:42.854220 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pjgpx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"73bebfb3-50b5-48b6-b348-1d1feb6202d2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b0ac92c71edc3d5107aece2d0e005a546cf25d79d696f4e330b7c0c8babc546\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mount
Path\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h59cg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pjgpx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:42Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:42 crc kubenswrapper[4677]: I1007 13:07:42.868302 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qf2v9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea5a5436-29f6-4edd-9d4d-22eb9dd828c3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ftmxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ftmxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qf2v9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:42Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:42 crc kubenswrapper[4677]: I1007 13:07:42.881468 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c40c47d2-50a4-43f1-9b6e-08b60a3260c5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f36e52a7e88b59d8fd38c1fe659ce9b539e514c9d31e326a3ed647ebb8d19781\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://84b1c015461fecca9e5122abe950f33e24f4b7188568ea84cb059a08a4637963\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ecf81a2a9f147c0d9643f8e6c45248164053203ca4e5bbdc57c38e5803a5386\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e8dc3f8bdc52104efdb49a017d6497e2aaa3ed2b593794413fcd1acf2e06d36\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:42Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:42 crc kubenswrapper[4677]: I1007 13:07:42.892875 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:42Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:42 crc kubenswrapper[4677]: I1007 13:07:42.948367 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:42 crc kubenswrapper[4677]: I1007 13:07:42.948561 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:42 crc kubenswrapper[4677]: I1007 13:07:42.948737 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:42 crc kubenswrapper[4677]: I1007 13:07:42.948968 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:42 crc kubenswrapper[4677]: I1007 13:07:42.949069 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:42Z","lastTransitionTime":"2025-10-07T13:07:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:07:43 crc kubenswrapper[4677]: I1007 13:07:43.052365 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:43 crc kubenswrapper[4677]: I1007 13:07:43.052676 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:43 crc kubenswrapper[4677]: I1007 13:07:43.052763 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:43 crc kubenswrapper[4677]: I1007 13:07:43.052856 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:43 crc kubenswrapper[4677]: I1007 13:07:43.052936 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:43Z","lastTransitionTime":"2025-10-07T13:07:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:07:43 crc kubenswrapper[4677]: I1007 13:07:43.155373 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:43 crc kubenswrapper[4677]: I1007 13:07:43.155412 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:43 crc kubenswrapper[4677]: I1007 13:07:43.155423 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:43 crc kubenswrapper[4677]: I1007 13:07:43.155453 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:43 crc kubenswrapper[4677]: I1007 13:07:43.155463 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:43Z","lastTransitionTime":"2025-10-07T13:07:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:07:43 crc kubenswrapper[4677]: I1007 13:07:43.176002 4677 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-8bljr"] Oct 07 13:07:43 crc kubenswrapper[4677]: I1007 13:07:43.176763 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8bljr" Oct 07 13:07:43 crc kubenswrapper[4677]: E1007 13:07:43.176920 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8bljr" podUID="f63a77a6-7e4a-4ed0-a996-b8f80233d10c" Oct 07 13:07:43 crc kubenswrapper[4677]: I1007 13:07:43.189970 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:43Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:43 crc kubenswrapper[4677]: I1007 13:07:43.200319 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e49f4b2f98de9e297e6a31a5583120192adf9a013700b49bb419e54d9e75fdbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:43Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:43 crc kubenswrapper[4677]: I1007 13:07:43.212985 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c40c47d2-50a4-43f1-9b6e-08b60a3260c5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f36e52a7e88b59d8fd38c1fe659ce9b539e514c9d31e326a3ed647ebb8d19781\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://84b1c015461fecca9e5122abe950f33e24f4b7188568ea84cb059a08a4637963\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ecf81a2a9f147c0d9643f8e6c45248164053203ca4e5bbdc57c38e5803a5386\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e8dc3f8bdc52104efdb49a017d6497e2aaa3ed2b593794413fcd1acf2e06d36\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:43Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:43 crc kubenswrapper[4677]: I1007 13:07:43.224447 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:43Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:43 crc kubenswrapper[4677]: I1007 13:07:43.232068 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rc97j\" (UniqueName: \"kubernetes.io/projected/f63a77a6-7e4a-4ed0-a996-b8f80233d10c-kube-api-access-rc97j\") pod \"network-metrics-daemon-8bljr\" (UID: \"f63a77a6-7e4a-4ed0-a996-b8f80233d10c\") " pod="openshift-multus/network-metrics-daemon-8bljr" Oct 07 13:07:43 crc kubenswrapper[4677]: I1007 13:07:43.232101 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f63a77a6-7e4a-4ed0-a996-b8f80233d10c-metrics-certs\") pod \"network-metrics-daemon-8bljr\" (UID: \"f63a77a6-7e4a-4ed0-a996-b8f80233d10c\") " pod="openshift-multus/network-metrics-daemon-8bljr" Oct 07 13:07:43 crc kubenswrapper[4677]: I1007 13:07:43.235569 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-r7cnz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7879fa59-a7cb-4d29-ba3a-c91f43bfcba6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5b2cfaaf4533573a7cdf927cb9a0b61690f4f04ca22f5da5013fd218ee2cba1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r59hd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d3e4ef8267212ad1faf24bfcb3b6f633a283684ba587e304e94d434bd9a2618\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r59hd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-r7cnz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:43Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:43 crc kubenswrapper[4677]: I1007 13:07:43.253222 4677 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29c8j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3458826a-000d-407d-92c8-236d1a05842e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cf7d8cdd34bc883eae38c5e4690efd4e1c29cc633b5bbadc5de2b5b844a9da3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b77b2aafb3baf1c5b72d62156bd1c1bec76385637d5795166fe3d4f22a169503\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99b5fbb5ad3249aa5264c37bd635ed5f6283ec72c7eb071002cd7bddc12052f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eee7c253a1a514447553be977a3e534608ef6a1178664bf139ee84ec41180db0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f333db7aeb7d3cd308131992b4cd1284c1c56e27bbfd731404febc0efc953925\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ddf4e352b778815786f6fb204486a53d958310e53569f89a2895fe388a727da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\
\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fba8b3547cb6e55cb13bf4889eabf3bf5e731d44475c488c659fe844b985074\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7dba5075e656b746f559affe6da7c0989fbbf09769b561bbdd1ddb8e552134a4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T13:07:39Z\\\",\\\"message\\\":\\\"orkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1007 13:07:39.213014 5962 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1007 13:07:39.213230 5962 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/factory.go:117\\\\nI1007 13:07:39.213573 5962 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1007 13:07:39.213591 5962 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1007 13:07:39.213763 5962 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1007 13:07:39.214112 5962 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1007 13:07:39.214258 5962 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:36Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fba8b3547cb6e55cb13bf4889eabf3bf5e731d44475c488c659fe844b985074\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T13:07:41Z\\\",\\\"message\\\":\\\"mage-registry]} name:Service_openshift-image-registry/image-registry_TCP_cluster 
options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.93:5000:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {83c1e277-3d22-42ae-a355-f7a0ff0bd171}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1007 13:07:41.452562 6109 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-ingress-canary/ingress-canary]} name:Service_openshift-ingress-canary/ingress-canary_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.34:8443: 10.217.5.34:8888:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7715118b-bb1b-400a-803e-7ab2cc3eeec0}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1007 13:07:41.452766 6109 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-console/console\\\\\\\"}\\\\nF1007 13:07:41.452679 6109 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\
\\"}]},{\\\"containerID\\\":\\\"cri-o://b1f410624ff7e026c196d43d5ef830ce7b34981b703d5399a135dab0122640ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa229d0805576aa04db8e58e52e8c8bdc07663414dde42a94cac4fe628cfac7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa229d0805576aa04db8e58e52e8c8bdc07663414dde42a94cac4fe628cfac7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-29c8j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:43Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:43 crc kubenswrapper[4677]: I1007 13:07:43.257040 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:43 crc kubenswrapper[4677]: I1007 13:07:43.257064 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:43 crc kubenswrapper[4677]: I1007 13:07:43.257076 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:43 crc kubenswrapper[4677]: I1007 13:07:43.257091 4677 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeNotReady" Oct 07 13:07:43 crc kubenswrapper[4677]: I1007 13:07:43.257101 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:43Z","lastTransitionTime":"2025-10-07T13:07:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:07:43 crc kubenswrapper[4677]: I1007 13:07:43.266800 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pjgpx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"73bebfb3-50b5-48b6-b348-1d1feb6202d2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b0ac92c71edc3d5107aece2d0e005a546cf25d79d696f4e330b7c0c8babc546\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mount
Path\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h59cg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pjgpx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:43Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:43 crc kubenswrapper[4677]: I1007 13:07:43.280454 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qf2v9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea5a5436-29f6-4edd-9d4d-22eb9dd828c3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ftmxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ftmxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qf2v9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:43Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:43 crc kubenswrapper[4677]: I1007 13:07:43.291099 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8bljr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f63a77a6-7e4a-4ed0-a996-b8f80233d10c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rc97j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rc97j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:43Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8bljr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:43Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:43 crc kubenswrapper[4677]: I1007 13:07:43.302696 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:07:43 crc kubenswrapper[4677]: I1007 13:07:43.302752 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 13:07:43 crc kubenswrapper[4677]: E1007 13:07:43.303000 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 13:07:43 crc kubenswrapper[4677]: E1007 13:07:43.303005 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 13:07:43 crc kubenswrapper[4677]: I1007 13:07:43.305182 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf9347ca53bc58ad2e19bdbccd5eb40fde5ef36cdc0c2a2899e7e86977208446\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7932ee6d24ab75f34dabb17b5c2732dc1437e94b4fab6cace5c5bf4d8b4a8fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:43Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:43 crc kubenswrapper[4677]: I1007 13:07:43.319285 4677 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-dns/node-resolver-c2h2k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6a7b491-6ed9-4906-8d2d-d8913a581b95\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://079f44a0676fd6e659268707658dfce76f5c80881ebd1b7f77b831a653002cbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gh4gv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c2h2k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:43Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:43 crc kubenswrapper[4677]: I1007 13:07:43.331974 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-czmsr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"67f7f734-b59a-447c-b4a5-5aeb78d3a4dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://620c115cda692d86e1c655fe633ade8d56b4ad3faff70ec3383e0d6931e91acd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84e2aad54ae491e74845fbb681491bde96694b5f242d15a1b4c0f62b81048e7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84e2aad54ae491e74845fbb681491bde96694b5f242d15a1b4c0f62b81048e7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2dfd4b5d1804d9d8e9ccf4dea8fb23cf60cd039e32d4bfc7d0b0e189da178b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2dfd4b5d1804d9d8e9ccf4dea8fb23cf60cd039e32d4bfc7d0b0e189da178b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c337ed2525a5d0aca41ff193138829ecd98172765110f97081d5a9b57ea39519\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c337ed2525a5d0aca41ff193138829ecd98172765110f97081d5a9b57ea39519\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b53d862bdcbdf9ebb58ca6717073ecf18236be981bf633bf22e3a077fa6ed90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b53d862bdcbdf9ebb58ca6717073ecf18236be981bf633bf22e3a077fa6ed90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3de85e8780fadb509355343e297ea73c2bd1047219be9911d445e044d40d378b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3de85e8780fadb509355343e297ea73c2bd1047219be9911d445e044d40d378b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6309cf358d3d85018e9e89fdb974146ab00944de09d3b38cfbf357df59b442f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6309cf358d3d85018e9e89fdb974146ab00944de09d3b38cfbf357df59b442f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-czmsr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:43Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:43 crc kubenswrapper[4677]: I1007 13:07:43.332643 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f63a77a6-7e4a-4ed0-a996-b8f80233d10c-metrics-certs\") pod \"network-metrics-daemon-8bljr\" (UID: \"f63a77a6-7e4a-4ed0-a996-b8f80233d10c\") " 
pod="openshift-multus/network-metrics-daemon-8bljr" Oct 07 13:07:43 crc kubenswrapper[4677]: I1007 13:07:43.332732 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rc97j\" (UniqueName: \"kubernetes.io/projected/f63a77a6-7e4a-4ed0-a996-b8f80233d10c-kube-api-access-rc97j\") pod \"network-metrics-daemon-8bljr\" (UID: \"f63a77a6-7e4a-4ed0-a996-b8f80233d10c\") " pod="openshift-multus/network-metrics-daemon-8bljr" Oct 07 13:07:43 crc kubenswrapper[4677]: E1007 13:07:43.332767 4677 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 07 13:07:43 crc kubenswrapper[4677]: E1007 13:07:43.332842 4677 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f63a77a6-7e4a-4ed0-a996-b8f80233d10c-metrics-certs podName:f63a77a6-7e4a-4ed0-a996-b8f80233d10c nodeName:}" failed. No retries permitted until 2025-10-07 13:07:43.832819969 +0000 UTC m=+35.318529194 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f63a77a6-7e4a-4ed0-a996-b8f80233d10c-metrics-certs") pod "network-metrics-daemon-8bljr" (UID: "f63a77a6-7e4a-4ed0-a996-b8f80233d10c") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 07 13:07:43 crc kubenswrapper[4677]: I1007 13:07:43.352602 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9c35782-52f8-4fbc-9e52-07ee92002e3d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9182677b05c8d32f333b4e806b6dc29e0ce3f6171616ed303459ccb6a3754a4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://550a4491cebcbd8b3a62831cce07b13bb79051cd51505aab1f74bcfee692f7b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a
5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d90d7cc786a9f94c269a99be97c00685a2e10bde12e0afe4db2de40b95749a47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e99541a9f53339e760fb1074be18ebfcb8b225c64c290478559d2e3722ba9296\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73cec6b690e4d01d9d206a812f278832d622a7bdfb74ddcfb5904e19f721fae6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f16639187300b01b7c9f30dda26da7c1de9de9c66baf7ad716875eba41a5d7d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID
\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f16639187300b01b7c9f30dda26da7c1de9de9c66baf7ad716875eba41a5d7d6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dab4e59081dcacddedf345841dce8c940fae52aeee55d6c956e1364dd70872b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dab4e59081dcacddedf345841dce8c940fae52aeee55d6c956e1364dd70872b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://abcfe5aa83c73ae24b7f0b414b47344969924ca50a5e1669bfb4704f1386cd17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abcfe5aa83c73ae24b7f0b414b47344969924ca50a5e1669bfb4704f1386cd17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:43Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:43 crc kubenswrapper[4677]: I1007 13:07:43.353378 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rc97j\" (UniqueName: \"kubernetes.io/projected/f63a77a6-7e4a-4ed0-a996-b8f80233d10c-kube-api-access-rc97j\") pod \"network-metrics-daemon-8bljr\" (UID: \"f63a77a6-7e4a-4ed0-a996-b8f80233d10c\") " pod="openshift-multus/network-metrics-daemon-8bljr" Oct 07 13:07:43 crc kubenswrapper[4677]: I1007 13:07:43.359725 4677 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:43 crc kubenswrapper[4677]: I1007 13:07:43.359761 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:43 crc kubenswrapper[4677]: I1007 13:07:43.359769 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:43 crc kubenswrapper[4677]: I1007 13:07:43.359784 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:43 crc kubenswrapper[4677]: I1007 13:07:43.359795 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:43Z","lastTransitionTime":"2025-10-07T13:07:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:07:43 crc kubenswrapper[4677]: I1007 13:07:43.384470 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ad65790-6a90-4c21-b5c5-ac1ddf2cbe52\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34b15b8f71a10920a74f784c3440031e14726f661e10f628b269da08e70a7cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a51c6d4e82136b444754dc679f864558f74624af4ff94f794e473c92c8f6c87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96d7ddc445a61d5fd4959d1b3b4e2c93503111a12f461d945dd298a3f8540f65\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f3139524d94a487c756b4a7e3660c0a4380bcba8aa8b588368530f43aab0b1d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f3139524d94a487c756b4a7e3660c0a4380bcba8aa8b588368530f43aab0b1d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T13:07:25Z\\\",\\\"message\\\":\\\"le observer\\\\nW1007 13:07:25.065953 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1007 13:07:25.066115 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 13:07:25.067558 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2107846005/tls.crt::/tmp/serving-cert-2107846005/tls.key\\\\\\\"\\\\nI1007 13:07:25.378394 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1007 13:07:25.383525 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1007 13:07:25.383565 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1007 13:07:25.383605 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1007 13:07:25.383617 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1007 13:07:25.390977 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1007 13:07:25.391014 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1007 13:07:25.391020 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 13:07:25.391038 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 13:07:25.391053 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1007 13:07:25.391061 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1007 13:07:25.391071 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1007 13:07:25.391088 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1007 13:07:25.394664 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b92962cb41f37a615f473651c01e37f5d53e01f3fb4b7c0eb2092095bb55239\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d927fbc5242eb951e8c2ec6d2859315f34e4b190bb43e646044111d1f583bf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d927fbc5242eb951e8c2ec6d2859315f34e4b190bb43e646044111d1f583bf0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:43Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:43 crc kubenswrapper[4677]: I1007 13:07:43.398498 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ef29067dc23263a94c4f861ada9ebbe04aae442de3da9fa34db521177f60ce8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:43Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:43 crc kubenswrapper[4677]: I1007 13:07:43.412054 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:43Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:43 crc kubenswrapper[4677]: I1007 13:07:43.422544 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8xd94" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f78c6e9a-e5e3-4296-b2e6-3ba36d1808ff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2e7ebbc9f01ac7f853075c65c8cc57c691cf3f95e41036294486ad4a3bb807c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8xd94\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:43Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:43 crc kubenswrapper[4677]: I1007 13:07:43.462462 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:43 crc kubenswrapper[4677]: I1007 13:07:43.462499 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:43 crc kubenswrapper[4677]: I1007 13:07:43.462510 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:43 crc kubenswrapper[4677]: I1007 13:07:43.462524 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:43 crc kubenswrapper[4677]: I1007 13:07:43.462533 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:43Z","lastTransitionTime":"2025-10-07T13:07:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:07:43 crc kubenswrapper[4677]: I1007 13:07:43.565059 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:43 crc kubenswrapper[4677]: I1007 13:07:43.565105 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:43 crc kubenswrapper[4677]: I1007 13:07:43.565116 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:43 crc kubenswrapper[4677]: I1007 13:07:43.565132 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:43 crc kubenswrapper[4677]: I1007 13:07:43.565143 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:43Z","lastTransitionTime":"2025-10-07T13:07:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:07:43 crc kubenswrapper[4677]: I1007 13:07:43.600384 4677 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-29c8j_3458826a-000d-407d-92c8-236d1a05842e/ovnkube-controller/1.log" Oct 07 13:07:43 crc kubenswrapper[4677]: I1007 13:07:43.607864 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qf2v9" event={"ID":"ea5a5436-29f6-4edd-9d4d-22eb9dd828c3","Type":"ContainerStarted","Data":"97188126bcea5ad3844f74c9402e831926e1142944778240b4d4b26da7ea40c7"} Oct 07 13:07:43 crc kubenswrapper[4677]: I1007 13:07:43.607891 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qf2v9" event={"ID":"ea5a5436-29f6-4edd-9d4d-22eb9dd828c3","Type":"ContainerStarted","Data":"b1a8e8a31adbc84ed02ff984941bb00da95740b19e8717fc6d4fb39b62338973"} Oct 07 13:07:43 crc kubenswrapper[4677]: I1007 13:07:43.609556 4677 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Oct 07 13:07:43 crc kubenswrapper[4677]: I1007 13:07:43.610908 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"13a4e9dcaf4f4585c45625444ba093f84acc83f03e96235efd054bed4a38fc21"} Oct 07 13:07:43 crc kubenswrapper[4677]: I1007 13:07:43.611165 4677 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 07 13:07:43 crc kubenswrapper[4677]: I1007 13:07:43.637349 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9c35782-52f8-4fbc-9e52-07ee92002e3d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9182677b05c8d32f333b4e806b6dc29e0ce3f6171616ed303459ccb6a3754a4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://550a4491cebcbd8b3a62831cce07b13bb79051cd51505aab1f74bcfee692f7b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d90d7cc786a9f94c269a99be97c00685a2e10bde12e0afe4db2de40b95749a47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e99541a9f53339e760fb1074be18ebfcb8b225c
64c290478559d2e3722ba9296\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73cec6b690e4d01d9d206a812f278832d622a7bdfb74ddcfb5904e19f721fae6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f16639187300b01b7c9f30dda26da7c1de9de9c66baf7ad716875eba41a5d7d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f16639187300b01b7c9f30dda26da7c1de9de9c66baf7ad716875eba41a5d7d6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dab4e59081dcacddedf345841dce8c940fae52aeee55d6c956e1364dd70872b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dab4e59081dcacddedf345841dce8c940fae52aeee55d6c956e1364dd70872b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://abcfe5aa83c73ae24b7f0b414b47344969924ca50a5e1669bfb4704f1386cd17\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abcfe5aa83c73ae24b7f0b414b47344969924ca50a5e1669bfb4704f1386cd17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:43Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:43 crc kubenswrapper[4677]: I1007 13:07:43.655561 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ad65790-6a90-4c21-b5c5-ac1ddf2cbe52\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34b15b8f71a10920a74f784c3440031e14726f661e10f628b269da08e70a7cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a51c6d4e82136b444754dc679f864558f74624af4ff94f794e473c92c8f6c87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96d7ddc445a61d5fd4959d1b3b4e2c93503111a12f461d945dd298a3f8540f65\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f3139524d94a487c756b4a7e3660c0a4380bcba8aa8b588368530f43aab0b1d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f3139524d94a487c756b4a7e3660c0a4380bcba8aa8b588368530f43aab0b1d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T13:07:25Z\\\",\\\"message\\\":\\\"le observer\\\\nW1007 13:07:25.065953 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1007 13:07:25.066115 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 13:07:25.067558 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2107846005/tls.crt::/tmp/serving-cert-2107846005/tls.key\\\\\\\"\\\\nI1007 13:07:25.378394 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1007 13:07:25.383525 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1007 13:07:25.383565 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1007 13:07:25.383605 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1007 13:07:25.383617 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1007 13:07:25.390977 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1007 13:07:25.391014 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1007 13:07:25.391020 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 13:07:25.391038 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 13:07:25.391053 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1007 13:07:25.391061 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1007 13:07:25.391071 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1007 13:07:25.391088 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1007 13:07:25.394664 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b92962cb41f37a615f473651c01e37f5d53e01f3fb4b7c0eb2092095bb55239\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d927fbc5242eb951e8c2ec6d2859315f34e4b190bb43e646044111d1f583bf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d927fbc5242eb951e8c2ec6d2859315f34e4b190bb43e646044111d1f583bf0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:43Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:43 crc kubenswrapper[4677]: I1007 13:07:43.667096 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:43 crc kubenswrapper[4677]: I1007 13:07:43.667205 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:43 crc kubenswrapper[4677]: I1007 13:07:43.667216 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:43 crc kubenswrapper[4677]: I1007 13:07:43.667234 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:43 crc kubenswrapper[4677]: I1007 13:07:43.667245 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:43Z","lastTransitionTime":"2025-10-07T13:07:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:07:43 crc kubenswrapper[4677]: I1007 13:07:43.670422 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ef29067dc23263a94c4f861ada9ebbe04aae442de3da9fa34db521177f60ce8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:43Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:43 crc kubenswrapper[4677]: I1007 13:07:43.683507 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:43Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:43 crc kubenswrapper[4677]: I1007 13:07:43.695581 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8xd94" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f78c6e9a-e5e3-4296-b2e6-3ba36d1808ff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2e7ebbc9f01ac7f853075c65c8cc57c691cf3f95e41036294486ad4a3bb807c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\
\\"name\\\":\\\"kube-api-access-k9nz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8xd94\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:43Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:43 crc kubenswrapper[4677]: I1007 13:07:43.706517 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:43Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:43 crc kubenswrapper[4677]: I1007 13:07:43.716284 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e49f4b2f98de9e297e6a31a5583120192adf9a013700b49bb419e54d9e75fdbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:43Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:43 crc kubenswrapper[4677]: I1007 13:07:43.725890 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8bljr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f63a77a6-7e4a-4ed0-a996-b8f80233d10c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rc97j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rc97j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:43Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8bljr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:43Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:43 crc kubenswrapper[4677]: I1007 13:07:43.740211 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c40c47d2-50a4-43f1-9b6e-08b60a3260c5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f36e52a7e88b59d8fd38c1fe659ce9b539e514c9d31e326a3ed647ebb8d19781\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://84b1c015461fecca9e5122abe950f33e24f4b7188568ea84cb059a08a4637963\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ecf81a2a9f147c0d9643f8e6c45248164053203ca4e5bbdc57c38e5803a5386\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e8dc3f8bdc52104efdb49a017d6497e2aaa3ed2b593794413fcd1acf2e06d36\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:43Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:43 crc kubenswrapper[4677]: I1007 13:07:43.752051 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:43Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:43 crc kubenswrapper[4677]: I1007 13:07:43.761803 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-r7cnz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7879fa59-a7cb-4d29-ba3a-c91f43bfcba6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5b2cfaaf4533573a7cdf927cb9a0b61690f4f04ca22f5da5013fd218ee2cba1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r59hd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d3e4ef8267212ad1faf24bfcb3b6f633a283684ba587e304e94d434bd9a2618\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r59hd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-r7cnz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:43Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:43 crc kubenswrapper[4677]: I1007 13:07:43.768649 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:43 crc kubenswrapper[4677]: I1007 13:07:43.768687 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:43 crc kubenswrapper[4677]: I1007 13:07:43.768699 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:43 crc kubenswrapper[4677]: I1007 13:07:43.768714 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:43 crc kubenswrapper[4677]: I1007 13:07:43.768723 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:43Z","lastTransitionTime":"2025-10-07T13:07:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:07:43 crc kubenswrapper[4677]: I1007 13:07:43.778885 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29c8j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3458826a-000d-407d-92c8-236d1a05842e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cf7d8cdd34bc883eae38c5e4690efd4e1c29cc633b5bbadc5de2b5b844a9da3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b77b2aafb3baf1c5b72d62156bd1c1bec76385637d5795166fe3d4f22a169503\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\
":\\\"cri-o://99b5fbb5ad3249aa5264c37bd635ed5f6283ec72c7eb071002cd7bddc12052f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eee7c253a1a514447553be977a3e534608ef6a1178664bf139ee84ec41180db0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f333db7aeb7d3cd308131992b4cd1284c1c56e27bbfd731404febc0efc953925\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ddf4e352b778815786f6fb204486a53d958310e53569f89a2895fe388a727da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fba8b3547cb6e55cb13bf4889eabf3bf5e731d44475c488c659fe844b985074\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7dba5075e656b746f559affe6da7c0989fbbf09769b561bbdd1ddb8e552134a4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T13:07:39Z\\\",\\\"message\\\":\\\"orkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1007 13:07:39.213014 5962 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1007 13:07:39.213230 5962 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/factory.go:117\\\\nI1007 13:07:39.213573 5962 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1007 13:07:39.213591 5962 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1007 13:07:39.213763 5962 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1007 13:07:39.214112 5962 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1007 13:07:39.214258 5962 reflector.go:311] Stopping reflector *v1.Namespace (0s) from 
k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:36Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fba8b3547cb6e55cb13bf4889eabf3bf5e731d44475c488c659fe844b985074\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T13:07:41Z\\\",\\\"message\\\":\\\"mage-registry]} name:Service_openshift-image-registry/image-registry_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.93:5000:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {83c1e277-3d22-42ae-a355-f7a0ff0bd171}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1007 13:07:41.452562 6109 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-ingress-canary/ingress-canary]} name:Service_openshift-ingress-canary/ingress-canary_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.34:8443: 10.217.5.34:8888:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7715118b-bb1b-400a-803e-7ab2cc3eeec0}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1007 13:07:41.452766 6109 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-console/console\\\\\\\"}\\\\nF1007 13:07:41.452679 6109 
ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1f410624ff7e026c196d43d5ef830ce7b34981b703d5399a135dab0122640ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa229d0805576aa04db8e58e52e8c8bdc07663414dde42a94cac4fe628cfac7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd4
7ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa229d0805576aa04db8e58e52e8c8bdc07663414dde42a94cac4fe628cfac7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-29c8j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:43Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:43 crc kubenswrapper[4677]: I1007 13:07:43.790682 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pjgpx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"73bebfb3-50b5-48b6-b348-1d1feb6202d2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b0ac92c71edc3d5107aece2d0e005a546cf25d79d696f4e330b7c0c8babc546\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/h
ost/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h59cg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pjgpx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:43Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:43 crc kubenswrapper[4677]: I1007 13:07:43.801557 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qf2v9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea5a5436-29f6-4edd-9d4d-22eb9dd828c3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1a8e8a31adbc84ed02ff984941bb00da95740b19e8717fc6d4fb39b62338973\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ftmxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://97188126bcea5ad3844f74c9402e831926e1142944778240b4d4b26da7ea40c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ftmxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qf2v9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:43Z is after 2025-08-24T17:21:41Z" Oct 07 
13:07:43 crc kubenswrapper[4677]: I1007 13:07:43.811615 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf9347ca53bc58ad2e19bdbccd5eb40fde5ef36cdc0c2a2899e7e86977208446\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7932ee6d24ab75f34dabb17b5c2732dc1437e94b4fab6cace5c5bf4d8b4a8fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:43Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:43 crc kubenswrapper[4677]: I1007 13:07:43.819842 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c2h2k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6a7b491-6ed9-4906-8d2d-d8913a581b95\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://079f44a0676fd6e659268707658dfce76f5c80881ebd1b7f77b831a653002cbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gh4gv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c2h2k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:43Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:43 crc kubenswrapper[4677]: I1007 13:07:43.833933 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-czmsr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"67f7f734-b59a-447c-b4a5-5aeb78d3a4dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://620c115cda692d86e1c655fe633ade8d56b4ad3faff70ec3383e0d6931e91acd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84e2aad54ae491e74845fbb681491bde96694b5f242d15a1b4c0f62b81048e7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84e2aad54ae491e74845fbb681491bde96694b5f242d15a1b4c0f62b81048e7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2dfd4b5d1804d9d8e9ccf4dea8fb23cf60cd039e32d4bfc7d0b0e189da178b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2dfd4b5d1804d9d8e9ccf4dea8fb23cf60cd039e32d4bfc7d0b0e189da178b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c337ed2525a5d0aca41ff193138829ecd98172765110f97081d5a9b57ea39519\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c337ed2525a5d0aca41ff193138829ecd98172765110f97081d5a9b57ea39519\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b53d862bdcbdf9ebb58ca6717073ecf18236be981bf633bf22e3a077fa6ed90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b53d862bdcbdf9ebb58ca6717073ecf18236be981bf633bf22e3a077fa6ed90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3de85e8780fadb509355343e297ea73c2bd1047219be9911d445e044d40d378b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3de85e8780fadb509355343e297ea73c2bd1047219be9911d445e044d40d378b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6309cf358d3d85018e9e89fdb974146ab00944de09d3b38cfbf357df59b442f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6309cf358d3d85018e9e89fdb974146ab00944de09d3b38cfbf357df59b442f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-czmsr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:43Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:43 crc kubenswrapper[4677]: I1007 13:07:43.838611 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f63a77a6-7e4a-4ed0-a996-b8f80233d10c-metrics-certs\") pod \"network-metrics-daemon-8bljr\" (UID: \"f63a77a6-7e4a-4ed0-a996-b8f80233d10c\") " 
pod="openshift-multus/network-metrics-daemon-8bljr" Oct 07 13:07:43 crc kubenswrapper[4677]: E1007 13:07:43.838751 4677 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 07 13:07:43 crc kubenswrapper[4677]: E1007 13:07:43.838805 4677 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f63a77a6-7e4a-4ed0-a996-b8f80233d10c-metrics-certs podName:f63a77a6-7e4a-4ed0-a996-b8f80233d10c nodeName:}" failed. No retries permitted until 2025-10-07 13:07:44.838785317 +0000 UTC m=+36.324494432 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f63a77a6-7e4a-4ed0-a996-b8f80233d10c-metrics-certs") pod "network-metrics-daemon-8bljr" (UID: "f63a77a6-7e4a-4ed0-a996-b8f80233d10c") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 07 13:07:43 crc kubenswrapper[4677]: I1007 13:07:43.846537 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-czmsr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67f7f734-b59a-447c-b4a5-5aeb78d3a4dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://620c115cda692d86e1c655fe633ade8d56b4ad3faff70ec3383e0d6931e91acd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84e2aad54ae491e74845fbb681491bde96694b5f242d15a1b4c0f62b81048e7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":f
alse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84e2aad54ae491e74845fbb681491bde96694b5f242d15a1b4c0f62b81048e7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2dfd4b5d1804d9d8e9ccf4dea8fb23cf60cd039e32d4bfc7d0b0e189da178b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2dfd4b5d1804d9d8e9ccf4dea8fb23cf60cd039e32d4bfc7d0b0e189da178b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c337ed2525a5d0aca41ff193138829ecd98172765110f97081d5a9b57ea39519\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c337ed2525a5d0aca41ff193138829ecd98172765110f97081d5a9b57ea39519\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"con
tainerID\\\":\\\"cri-o://5b53d862bdcbdf9ebb58ca6717073ecf18236be981bf633bf22e3a077fa6ed90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b53d862bdcbdf9ebb58ca6717073ecf18236be981bf633bf22e3a077fa6ed90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3de85e8780fadb509355343e297ea73c2bd1047219be9911d445e044d40d378b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3de85e8780fadb509355343e297ea73c2bd1047219be9911d445e044d40d378b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6309cf358d3d85018e9e89fdb974146ab00944de09d3b38cfbf357df59b442f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6309cf358d3d85018e9e89fdb974146ab00944de09d3b38cfbf357df59b442f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-c
ni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-czmsr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:43Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:43 crc kubenswrapper[4677]: I1007 13:07:43.857685 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf9347ca53bc58ad2e19bdbccd5eb40fde5ef36cdc0c2a2899e7e86977208446\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7932ee6d24ab75f34dabb17b5c2732dc1437e94b4fab6cace5c5bf4d8b4a8fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"
kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:43Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:43 crc kubenswrapper[4677]: I1007 13:07:43.868623 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c2h2k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6a7b491-6ed9-4906-8d2d-d8913a581b95\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://079f44a0676fd6e659268707658dfce76f5c80881ebd1b7f77b831a653002cbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gh4gv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c2h2k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:43Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:43 crc kubenswrapper[4677]: I1007 13:07:43.871882 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:43 crc kubenswrapper[4677]: I1007 13:07:43.871933 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:43 crc kubenswrapper[4677]: I1007 
13:07:43.871947 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:43 crc kubenswrapper[4677]: I1007 13:07:43.871967 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:43 crc kubenswrapper[4677]: I1007 13:07:43.871985 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:43Z","lastTransitionTime":"2025-10-07T13:07:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:07:43 crc kubenswrapper[4677]: I1007 13:07:43.889807 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ef29067dc23263a94c4f861ada9ebbe04aae442de3da9fa34db521177f60ce8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:43Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:43 crc kubenswrapper[4677]: I1007 13:07:43.905298 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:43Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:43 crc kubenswrapper[4677]: I1007 13:07:43.918352 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8xd94" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f78c6e9a-e5e3-4296-b2e6-3ba36d1808ff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2e7ebbc9f01ac7f853075c65c8cc57c691cf3f95e41036294486ad4a3bb807c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8xd94\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:43Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:43 crc kubenswrapper[4677]: I1007 13:07:43.937904 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9c35782-52f8-4fbc-9e52-07ee92002e3d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9182677b05c8d32f333b4e806b6dc29e0ce3f6171616ed303459ccb6a3754a4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://550a4491cebcbd8b3a62831cce07b13bb79051cd51505aab1f74bcfee692f7b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d90d7cc786a9f94c269a99be97c00685a2e10bde12e0afe4db2de40b95749a47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e99541a9f53339e760fb1074be18ebfcb8b225c
64c290478559d2e3722ba9296\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73cec6b690e4d01d9d206a812f278832d622a7bdfb74ddcfb5904e19f721fae6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f16639187300b01b7c9f30dda26da7c1de9de9c66baf7ad716875eba41a5d7d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f16639187300b01b7c9f30dda26da7c1de9de9c66baf7ad716875eba41a5d7d6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dab4e59081dcacddedf345841dce8c940fae52aeee55d6c956e1364dd70872b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dab4e59081dcacddedf345841dce8c940fae52aeee55d6c956e1364dd70872b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://abcfe5aa83c73ae24b7f0b414b47344969924ca50a5e1669bfb4704f1386cd17\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abcfe5aa83c73ae24b7f0b414b47344969924ca50a5e1669bfb4704f1386cd17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:43Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:43 crc kubenswrapper[4677]: I1007 13:07:43.954326 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ad65790-6a90-4c21-b5c5-ac1ddf2cbe52\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34b15b8f71a10920a74f784c3440031e14726f661e10f628b269da08e70a7cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a51c6d4e82136b444754dc679f864558f74624af4ff94f794e473c92c8f6c87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96d7ddc445a61d5fd4959d1b3b4e2c93503111a12f461d945dd298a3f8540f65\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13a4e9dcaf4f4585c45625444ba093f84acc83f03e96235efd054bed4a38fc21\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f3139524d94a487c756b4a7e3660c0a4380bcba8aa8b588368530f43aab0b1d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T13:07:25Z\\\",\\\"message\\\":\\\"le observer\\\\nW1007 13:07:25.065953 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1007 13:07:25.066115 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 13:07:25.067558 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2107846005/tls.crt::/tmp/serving-cert-2107846005/tls.key\\\\\\\"\\\\nI1007 13:07:25.378394 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1007 13:07:25.383525 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1007 13:07:25.383565 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1007 13:07:25.383605 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1007 13:07:25.383617 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1007 13:07:25.390977 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1007 13:07:25.391014 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1007 13:07:25.391020 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 13:07:25.391038 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 13:07:25.391053 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1007 13:07:25.391061 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1007 13:07:25.391071 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1007 13:07:25.391088 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1007 13:07:25.394664 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b92962cb41f37a615f473651c01e37f5d53e01f3fb4b7c0eb2092095bb55239\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d927fbc5242eb951e8c2ec6d2859315f34e4b190bb43e646044111d1f583bf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d927fbc5242eb951e8c2ec6d2859315f34e4b190bb43e646044111d1f583bf0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:43Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:43 crc kubenswrapper[4677]: I1007 13:07:43.968763 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:43Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:43 crc kubenswrapper[4677]: I1007 13:07:43.974189 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:43 crc kubenswrapper[4677]: I1007 13:07:43.974232 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:43 crc kubenswrapper[4677]: I1007 13:07:43.974242 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:43 crc kubenswrapper[4677]: I1007 13:07:43.974258 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:43 crc kubenswrapper[4677]: I1007 13:07:43.974268 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:43Z","lastTransitionTime":"2025-10-07T13:07:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:07:43 crc kubenswrapper[4677]: I1007 13:07:43.980809 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e49f4b2f98de9e297e6a31a5583120192adf9a013700b49bb419e54d9e75fdbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:43Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:43 crc kubenswrapper[4677]: I1007 13:07:43.997468 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29c8j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3458826a-000d-407d-92c8-236d1a05842e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cf7d8cdd34bc883eae38c5e4690efd4e1c29cc633b5bbadc5de2b5b844a9da3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b77b2aafb3baf1c5b72d62156bd1c1bec76385637d5795166fe3d4f22a169503\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99b5fbb5ad3249aa5264c37bd635ed5f6283ec72c7eb071002cd7bddc12052f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eee7c253a1a514447553be977a3e534608ef6a1178664bf139ee84ec41180db0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f333db7aeb7d3cd308131992b4cd1284c1c56e27bbfd731404febc0efc953925\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ddf4e352b778815786f6fb204486a53d958310e53569f89a2895fe388a727da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\
"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fba8b3547cb6e55cb13bf4889eabf3bf5e731d44475c488c659fe844b985074\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7dba5075e656b746f559affe6da7c0989fbbf09769b561bbdd1ddb8e552134a4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T13:07:39Z\\\",\\\"message\\\":\\\"orkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1007 13:07:39.213014 5962 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1007 13:07:39.213230 5962 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/factory.go:117\\\\nI1007 13:07:39.213573 5962 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1007 13:07:39.213591 5962 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1007 13:07:39.213763 5962 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1007 13:07:39.214112 5962 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1007 13:07:39.214258 5962 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:36Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fba8b3547cb6e55cb13bf4889eabf3bf5e731d44475c488c659fe844b985074\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T13:07:41Z\\\",\\\"message\\\":\\\"mage-registry]} name:Service_openshift-image-registry/image-registry_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.93:5000:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {83c1e277-3d22-42ae-a355-f7a0ff0bd171}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1007 13:07:41.452562 6109 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-ingress-canary/ingress-canary]} name:Service_openshift-ingress-canary/ingress-canary_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} 
selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.34:8443: 10.217.5.34:8888:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7715118b-bb1b-400a-803e-7ab2cc3eeec0}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1007 13:07:41.452766 6109 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-console/console\\\\\\\"}\\\\nF1007 13:07:41.452679 6109 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1f410624ff7e026c196d43d5ef830ce7b34981b703d5399a135dab0122640ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"n
ame\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa229d0805576aa04db8e58e52e8c8bdc07663414dde42a94cac4fe628cfac7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa229d0805576aa04db8e58e52e8c8bdc07663414dde42a94cac4fe628cfac7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-29c8j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:43Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:44 crc kubenswrapper[4677]: I1007 13:07:44.009357 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pjgpx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"73bebfb3-50b5-48b6-b348-1d1feb6202d2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b0ac92c71edc3d5107aece2d0e005a546cf25d79d696f4e330b7c0c8babc546\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h59cg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pjgpx\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:44Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:44 crc kubenswrapper[4677]: I1007 13:07:44.019599 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qf2v9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea5a5436-29f6-4edd-9d4d-22eb9dd828c3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1a8e8a31adbc84ed02ff984941bb00da95740b19e8717fc6d4fb39b62338973\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ftmxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://97188126bcea5ad3844f74c9402e831926e1142944778240b4d4b26da7ea40c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ftmxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\
\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qf2v9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:44Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:44 crc kubenswrapper[4677]: I1007 13:07:44.029662 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8bljr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f63a77a6-7e4a-4ed0-a996-b8f80233d10c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rc97j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rc97j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:43Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8bljr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:44Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:44 crc kubenswrapper[4677]: I1007 13:07:44.039915 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c40c47d2-50a4-43f1-9b6e-08b60a3260c5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f36e52a7e88b59d8fd38c1fe659ce9b539e514c9d31e326a3ed647ebb8d19781\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://84b1c015461fecca9e5122abe950f33e24f4b7188568ea84cb059a08a4637963\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ecf81a2a9f147c0d9643f8e6c45248164053203ca4e5bbdc57c38e5803a5386\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025
-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e8dc3f8bdc52104efdb49a017d6497e2aaa3ed2b593794413fcd1acf2e06d36\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:44Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:44 crc kubenswrapper[4677]: I1007 13:07:44.054228 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:44Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:44 crc kubenswrapper[4677]: I1007 13:07:44.063779 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-r7cnz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7879fa59-a7cb-4d29-ba3a-c91f43bfcba6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5b2cfaaf4533573a7cdf927cb9a0b61690f4f04ca22f5da5013fd218ee2cba1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r59hd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d3e4ef8267212ad1faf24bfcb3b6f633a283684ba587e304e94d434bd9a2618\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r59hd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-r7cnz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:44Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:44 crc kubenswrapper[4677]: I1007 13:07:44.076323 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:44 crc kubenswrapper[4677]: I1007 13:07:44.076354 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:44 crc kubenswrapper[4677]: I1007 13:07:44.076363 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:44 crc kubenswrapper[4677]: I1007 13:07:44.076377 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:44 crc kubenswrapper[4677]: I1007 13:07:44.076387 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:44Z","lastTransitionTime":"2025-10-07T13:07:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:07:44 crc kubenswrapper[4677]: I1007 13:07:44.178703 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:44 crc kubenswrapper[4677]: I1007 13:07:44.178765 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:44 crc kubenswrapper[4677]: I1007 13:07:44.178786 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:44 crc kubenswrapper[4677]: I1007 13:07:44.178811 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:44 crc kubenswrapper[4677]: I1007 13:07:44.178828 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:44Z","lastTransitionTime":"2025-10-07T13:07:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:07:44 crc kubenswrapper[4677]: I1007 13:07:44.281163 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:44 crc kubenswrapper[4677]: I1007 13:07:44.281215 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:44 crc kubenswrapper[4677]: I1007 13:07:44.281240 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:44 crc kubenswrapper[4677]: I1007 13:07:44.281269 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:44 crc kubenswrapper[4677]: I1007 13:07:44.281291 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:44Z","lastTransitionTime":"2025-10-07T13:07:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:07:44 crc kubenswrapper[4677]: I1007 13:07:44.302551 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 13:07:44 crc kubenswrapper[4677]: E1007 13:07:44.302752 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 13:07:44 crc kubenswrapper[4677]: I1007 13:07:44.383403 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:44 crc kubenswrapper[4677]: I1007 13:07:44.383506 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:44 crc kubenswrapper[4677]: I1007 13:07:44.383538 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:44 crc kubenswrapper[4677]: I1007 13:07:44.383568 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:44 crc kubenswrapper[4677]: I1007 13:07:44.383590 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:44Z","lastTransitionTime":"2025-10-07T13:07:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:07:44 crc kubenswrapper[4677]: I1007 13:07:44.486381 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:44 crc kubenswrapper[4677]: I1007 13:07:44.486479 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:44 crc kubenswrapper[4677]: I1007 13:07:44.486499 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:44 crc kubenswrapper[4677]: I1007 13:07:44.486524 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:44 crc kubenswrapper[4677]: I1007 13:07:44.486543 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:44Z","lastTransitionTime":"2025-10-07T13:07:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:07:44 crc kubenswrapper[4677]: I1007 13:07:44.589650 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:44 crc kubenswrapper[4677]: I1007 13:07:44.589798 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:44 crc kubenswrapper[4677]: I1007 13:07:44.589823 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:44 crc kubenswrapper[4677]: I1007 13:07:44.589855 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:44 crc kubenswrapper[4677]: I1007 13:07:44.589878 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:44Z","lastTransitionTime":"2025-10-07T13:07:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:07:44 crc kubenswrapper[4677]: I1007 13:07:44.693184 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:44 crc kubenswrapper[4677]: I1007 13:07:44.693244 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:44 crc kubenswrapper[4677]: I1007 13:07:44.693262 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:44 crc kubenswrapper[4677]: I1007 13:07:44.693286 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:44 crc kubenswrapper[4677]: I1007 13:07:44.693306 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:44Z","lastTransitionTime":"2025-10-07T13:07:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:07:44 crc kubenswrapper[4677]: I1007 13:07:44.796579 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:44 crc kubenswrapper[4677]: I1007 13:07:44.796647 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:44 crc kubenswrapper[4677]: I1007 13:07:44.796667 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:44 crc kubenswrapper[4677]: I1007 13:07:44.796696 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:44 crc kubenswrapper[4677]: I1007 13:07:44.796717 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:44Z","lastTransitionTime":"2025-10-07T13:07:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:07:44 crc kubenswrapper[4677]: I1007 13:07:44.850255 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f63a77a6-7e4a-4ed0-a996-b8f80233d10c-metrics-certs\") pod \"network-metrics-daemon-8bljr\" (UID: \"f63a77a6-7e4a-4ed0-a996-b8f80233d10c\") " pod="openshift-multus/network-metrics-daemon-8bljr" Oct 07 13:07:44 crc kubenswrapper[4677]: E1007 13:07:44.850496 4677 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 07 13:07:44 crc kubenswrapper[4677]: E1007 13:07:44.850613 4677 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f63a77a6-7e4a-4ed0-a996-b8f80233d10c-metrics-certs podName:f63a77a6-7e4a-4ed0-a996-b8f80233d10c nodeName:}" failed. No retries permitted until 2025-10-07 13:07:46.850576616 +0000 UTC m=+38.336285771 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f63a77a6-7e4a-4ed0-a996-b8f80233d10c-metrics-certs") pod "network-metrics-daemon-8bljr" (UID: "f63a77a6-7e4a-4ed0-a996-b8f80233d10c") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 07 13:07:44 crc kubenswrapper[4677]: I1007 13:07:44.899713 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:44 crc kubenswrapper[4677]: I1007 13:07:44.899784 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:44 crc kubenswrapper[4677]: I1007 13:07:44.899803 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:44 crc kubenswrapper[4677]: I1007 13:07:44.899826 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:44 crc kubenswrapper[4677]: I1007 13:07:44.899843 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:44Z","lastTransitionTime":"2025-10-07T13:07:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:07:44 crc kubenswrapper[4677]: I1007 13:07:44.951103 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 13:07:44 crc kubenswrapper[4677]: E1007 13:07:44.951385 4677 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 13:08:00.951343565 +0000 UTC m=+52.437052730 (durationBeforeRetry 16s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:07:45 crc kubenswrapper[4677]: I1007 13:07:45.003511 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:45 crc kubenswrapper[4677]: I1007 13:07:45.003569 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:45 crc kubenswrapper[4677]: I1007 13:07:45.003619 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:45 crc kubenswrapper[4677]: I1007 13:07:45.003641 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:45 crc kubenswrapper[4677]: I1007 13:07:45.003655 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:45Z","lastTransitionTime":"2025-10-07T13:07:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:07:45 crc kubenswrapper[4677]: I1007 13:07:45.053030 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:07:45 crc kubenswrapper[4677]: E1007 13:07:45.053223 4677 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 07 13:07:45 crc kubenswrapper[4677]: E1007 13:07:45.053339 4677 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 07 13:07:45 crc kubenswrapper[4677]: E1007 13:07:45.053356 4677 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-07 13:08:01.05332979 +0000 UTC m=+52.539038925 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 07 13:07:45 crc kubenswrapper[4677]: I1007 13:07:45.053236 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:07:45 crc kubenswrapper[4677]: E1007 13:07:45.053499 4677 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-07 13:08:01.053401173 +0000 UTC m=+52.539110338 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 07 13:07:45 crc kubenswrapper[4677]: I1007 13:07:45.106189 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:45 crc kubenswrapper[4677]: I1007 13:07:45.106234 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:45 crc kubenswrapper[4677]: I1007 13:07:45.106243 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:45 crc kubenswrapper[4677]: I1007 13:07:45.106259 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:45 crc kubenswrapper[4677]: I1007 13:07:45.106269 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:45Z","lastTransitionTime":"2025-10-07T13:07:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:07:45 crc kubenswrapper[4677]: I1007 13:07:45.154423 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 13:07:45 crc kubenswrapper[4677]: E1007 13:07:45.154586 4677 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 07 13:07:45 crc kubenswrapper[4677]: E1007 13:07:45.154617 4677 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 07 13:07:45 crc kubenswrapper[4677]: E1007 13:07:45.154634 4677 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 13:07:45 crc kubenswrapper[4677]: I1007 13:07:45.154625 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 13:07:45 crc kubenswrapper[4677]: E1007 13:07:45.154689 4677 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-07 13:08:01.154673116 +0000 UTC m=+52.640382241 (durationBeforeRetry 16s). 
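The projected-volume failures above report objects such as openshift-network-diagnostics/kube-root-ca.crt as "not registered". That wording comes from the kubelet's local object cache and, during a startup like this one, does not necessarily mean the ConfigMap is absent from the API server. A hedged client-go sketch to confirm the object exists independently of the kubelet (the kubeconfig path is a placeholder, not taken from the log):

package main

import (
	"context"
	"fmt"

	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	// Placeholder path; any kubeconfig with read access to the namespace works.
	cfg, err := clientcmd.BuildConfigFromFlags("", "/path/to/kubeconfig")
	if err != nil {
		fmt.Println("load kubeconfig:", err)
		return
	}
	client, err := kubernetes.NewForConfig(cfg)
	if err != nil {
		fmt.Println("build client:", err)
		return
	}
	// Namespace and name are taken from the errors above.
	cm, err := client.CoreV1().ConfigMaps("openshift-network-diagnostics").
		Get(context.TODO(), "kube-root-ca.crt", metav1.GetOptions{})
	if err != nil {
		fmt.Println("lookup failed:", err)
		return
	}
	fmt.Printf("kube-root-ca.crt exists with %d keys\n", len(cm.Data))
}

If the object is present, the "not registered" errors are a symptom of the kubelet's caches not being populated yet and should clear once the node settles. The log resumes below.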
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 13:07:45 crc kubenswrapper[4677]: E1007 13:07:45.154849 4677 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 07 13:07:45 crc kubenswrapper[4677]: E1007 13:07:45.154896 4677 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 07 13:07:45 crc kubenswrapper[4677]: E1007 13:07:45.154908 4677 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 13:07:45 crc kubenswrapper[4677]: E1007 13:07:45.154962 4677 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-07 13:08:01.154947235 +0000 UTC m=+52.640656350 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 13:07:45 crc kubenswrapper[4677]: I1007 13:07:45.209348 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:45 crc kubenswrapper[4677]: I1007 13:07:45.209410 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:45 crc kubenswrapper[4677]: I1007 13:07:45.209428 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:45 crc kubenswrapper[4677]: I1007 13:07:45.209492 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:45 crc kubenswrapper[4677]: I1007 13:07:45.209511 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:45Z","lastTransitionTime":"2025-10-07T13:07:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
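Every NodeNotReady condition in this boot repeats the same root cause: no CNI configuration file in /etc/kubernetes/cni/net.d/. A minimal sketch, assuming that directory path from the message is correct for this node, which reports whether the network plugin has written a configuration yet:

package main

import (
	"fmt"
	"os"
)

func main() {
	// Directory named in the kubelet message above.
	const dir = "/etc/kubernetes/cni/net.d"
	entries, err := os.ReadDir(dir)
	if err != nil {
		fmt.Println("cannot read", dir, ":", err)
		return
	}
	if len(entries) == 0 {
		fmt.Println("no CNI configuration files yet; NetworkReady will stay false")
		return
	}
	for _, e := range entries {
		fmt.Println(e.Name())
	}
}

The log resumes below.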
Has your network provider started?"} Oct 07 13:07:45 crc kubenswrapper[4677]: I1007 13:07:45.225628 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:45 crc kubenswrapper[4677]: I1007 13:07:45.225757 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:45 crc kubenswrapper[4677]: I1007 13:07:45.225784 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:45 crc kubenswrapper[4677]: I1007 13:07:45.225819 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:45 crc kubenswrapper[4677]: I1007 13:07:45.225847 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:45Z","lastTransitionTime":"2025-10-07T13:07:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:07:45 crc kubenswrapper[4677]: E1007 13:07:45.247391 4677 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:07:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:07:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:07:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:07:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2461c0fe-8a8b-483d-90f2-2a3d8d7aca47\\\",\\\"systemUUID\\\":\\\"68c6c527-b248-4c1e-9fd2-b44685e78bcf\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:45Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:45 crc kubenswrapper[4677]: I1007 13:07:45.252381 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:45 crc kubenswrapper[4677]: I1007 13:07:45.252464 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 07 13:07:45 crc kubenswrapper[4677]: I1007 13:07:45.252481 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:45 crc kubenswrapper[4677]: I1007 13:07:45.252506 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:45 crc kubenswrapper[4677]: I1007 13:07:45.252524 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:45Z","lastTransitionTime":"2025-10-07T13:07:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:07:45 crc kubenswrapper[4677]: E1007 13:07:45.276280 4677 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:07:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:07:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:07:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:07:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2461c0fe-8a8b-483d-90f2-2a3d8d7aca47\\\",\\\"systemUUID\\\":\\\"68c6c527-b248-4c1e-9fd2-b44685e78bcf\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:45Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:45 crc kubenswrapper[4677]: I1007 13:07:45.281198 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:45 crc kubenswrapper[4677]: I1007 13:07:45.281266 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 07 13:07:45 crc kubenswrapper[4677]: I1007 13:07:45.281290 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:45 crc kubenswrapper[4677]: I1007 13:07:45.281319 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:45 crc kubenswrapper[4677]: I1007 13:07:45.281341 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:45Z","lastTransitionTime":"2025-10-07T13:07:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:07:45 crc kubenswrapper[4677]: E1007 13:07:45.302560 4677 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:07:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:07:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:07:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:07:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2461c0fe-8a8b-483d-90f2-2a3d8d7aca47\\\",\\\"systemUUID\\\":\\\"68c6c527-b248-4c1e-9fd2-b44685e78bcf\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:45Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:45 crc kubenswrapper[4677]: I1007 13:07:45.302733 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:07:45 crc kubenswrapper[4677]: I1007 13:07:45.302771 4677 util.go:30] "No sandbox for pod can be found. 
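The node-status patches above keep failing because the node.network-node-identity.openshift.io webhook at https://127.0.0.1:9743 presents a serving certificate that expired on 2025-08-24, while the node clock reads 2025-10-07. A minimal sketch that prints the certificate's validity window without verifying the chain (the endpoint is taken from the log; everything else is illustrative):

package main

import (
	"crypto/tls"
	"fmt"
	"time"
)

func main() {
	// Endpoint taken from the failed webhook call; verification is skipped
	// deliberately so the handshake succeeds even with an expired certificate.
	conn, err := tls.Dial("tcp", "127.0.0.1:9743", &tls.Config{InsecureSkipVerify: true})
	if err != nil {
		fmt.Println("dial failed:", err)
		return
	}
	defer conn.Close()
	for _, cert := range conn.ConnectionState().PeerCertificates {
		fmt.Printf("%s: not before %s, not after %s (now %s)\n",
			cert.Subject.CommonName,
			cert.NotBefore.Format(time.RFC3339),
			cert.NotAfter.Format(time.RFC3339),
			time.Now().UTC().Format(time.RFC3339))
	}
}

Skipping verification here is only for inspection; the remedy is rotating the webhook's serving certificate (or correcting the clock), not trusting the expired one. The log resumes below.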
Need to start a new one" pod="openshift-multus/network-metrics-daemon-8bljr" Oct 07 13:07:45 crc kubenswrapper[4677]: I1007 13:07:45.302733 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 13:07:45 crc kubenswrapper[4677]: E1007 13:07:45.302910 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 13:07:45 crc kubenswrapper[4677]: E1007 13:07:45.303060 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8bljr" podUID="f63a77a6-7e4a-4ed0-a996-b8f80233d10c" Oct 07 13:07:45 crc kubenswrapper[4677]: E1007 13:07:45.303180 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 13:07:45 crc kubenswrapper[4677]: I1007 13:07:45.308863 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:45 crc kubenswrapper[4677]: I1007 13:07:45.308923 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:45 crc kubenswrapper[4677]: I1007 13:07:45.308943 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:45 crc kubenswrapper[4677]: I1007 13:07:45.308969 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:45 crc kubenswrapper[4677]: I1007 13:07:45.308992 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:45Z","lastTransitionTime":"2025-10-07T13:07:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:07:45 crc kubenswrapper[4677]: E1007 13:07:45.328892 4677 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:07:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:07:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:07:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:07:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2461c0fe-8a8b-483d-90f2-2a3d8d7aca47\\\",\\\"systemUUID\\\":\\\"68c6c527-b248-4c1e-9fd2-b44685e78bcf\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:45Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:45 crc kubenswrapper[4677]: I1007 13:07:45.334559 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:45 crc kubenswrapper[4677]: I1007 13:07:45.334632 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 07 13:07:45 crc kubenswrapper[4677]: I1007 13:07:45.334650 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:45 crc kubenswrapper[4677]: I1007 13:07:45.334672 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:45 crc kubenswrapper[4677]: I1007 13:07:45.334688 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:45Z","lastTransitionTime":"2025-10-07T13:07:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:07:45 crc kubenswrapper[4677]: E1007 13:07:45.353831 4677 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:07:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:07:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:07:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:07:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2461c0fe-8a8b-483d-90f2-2a3d8d7aca47\\\",\\\"systemUUID\\\":\\\"68c6c527-b248-4c1e-9fd2-b44685e78bcf\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:45Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:45 crc kubenswrapper[4677]: E1007 13:07:45.353986 4677 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 07 13:07:45 crc kubenswrapper[4677]: I1007 13:07:45.356146 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Oct 07 13:07:45 crc kubenswrapper[4677]: I1007 13:07:45.356195 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:45 crc kubenswrapper[4677]: I1007 13:07:45.356237 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:45 crc kubenswrapper[4677]: I1007 13:07:45.356259 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:45 crc kubenswrapper[4677]: I1007 13:07:45.356276 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:45Z","lastTransitionTime":"2025-10-07T13:07:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:07:45 crc kubenswrapper[4677]: I1007 13:07:45.459069 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:45 crc kubenswrapper[4677]: I1007 13:07:45.459135 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:45 crc kubenswrapper[4677]: I1007 13:07:45.459152 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:45 crc kubenswrapper[4677]: I1007 13:07:45.459177 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:45 crc kubenswrapper[4677]: I1007 13:07:45.459193 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:45Z","lastTransitionTime":"2025-10-07T13:07:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:07:45 crc kubenswrapper[4677]: I1007 13:07:45.561866 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:45 crc kubenswrapper[4677]: I1007 13:07:45.561925 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:45 crc kubenswrapper[4677]: I1007 13:07:45.561941 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:45 crc kubenswrapper[4677]: I1007 13:07:45.561966 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:45 crc kubenswrapper[4677]: I1007 13:07:45.561984 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:45Z","lastTransitionTime":"2025-10-07T13:07:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:07:45 crc kubenswrapper[4677]: I1007 13:07:45.666212 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:45 crc kubenswrapper[4677]: I1007 13:07:45.666266 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:45 crc kubenswrapper[4677]: I1007 13:07:45.666282 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:45 crc kubenswrapper[4677]: I1007 13:07:45.666303 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:45 crc kubenswrapper[4677]: I1007 13:07:45.666321 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:45Z","lastTransitionTime":"2025-10-07T13:07:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:07:45 crc kubenswrapper[4677]: I1007 13:07:45.769861 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:45 crc kubenswrapper[4677]: I1007 13:07:45.769922 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:45 crc kubenswrapper[4677]: I1007 13:07:45.769947 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:45 crc kubenswrapper[4677]: I1007 13:07:45.769982 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:45 crc kubenswrapper[4677]: I1007 13:07:45.770006 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:45Z","lastTransitionTime":"2025-10-07T13:07:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:07:45 crc kubenswrapper[4677]: I1007 13:07:45.873608 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:45 crc kubenswrapper[4677]: I1007 13:07:45.873682 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:45 crc kubenswrapper[4677]: I1007 13:07:45.873705 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:45 crc kubenswrapper[4677]: I1007 13:07:45.873737 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:45 crc kubenswrapper[4677]: I1007 13:07:45.873759 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:45Z","lastTransitionTime":"2025-10-07T13:07:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:07:45 crc kubenswrapper[4677]: I1007 13:07:45.977361 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:45 crc kubenswrapper[4677]: I1007 13:07:45.977414 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:45 crc kubenswrapper[4677]: I1007 13:07:45.977425 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:45 crc kubenswrapper[4677]: I1007 13:07:45.977486 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:45 crc kubenswrapper[4677]: I1007 13:07:45.977499 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:45Z","lastTransitionTime":"2025-10-07T13:07:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:07:46 crc kubenswrapper[4677]: I1007 13:07:46.080083 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:46 crc kubenswrapper[4677]: I1007 13:07:46.080134 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:46 crc kubenswrapper[4677]: I1007 13:07:46.080144 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:46 crc kubenswrapper[4677]: I1007 13:07:46.080162 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:46 crc kubenswrapper[4677]: I1007 13:07:46.080173 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:46Z","lastTransitionTime":"2025-10-07T13:07:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:07:46 crc kubenswrapper[4677]: I1007 13:07:46.182755 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:46 crc kubenswrapper[4677]: I1007 13:07:46.182828 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:46 crc kubenswrapper[4677]: I1007 13:07:46.182864 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:46 crc kubenswrapper[4677]: I1007 13:07:46.182892 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:46 crc kubenswrapper[4677]: I1007 13:07:46.182912 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:46Z","lastTransitionTime":"2025-10-07T13:07:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:07:46 crc kubenswrapper[4677]: I1007 13:07:46.285897 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:46 crc kubenswrapper[4677]: I1007 13:07:46.285950 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:46 crc kubenswrapper[4677]: I1007 13:07:46.285974 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:46 crc kubenswrapper[4677]: I1007 13:07:46.286004 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:46 crc kubenswrapper[4677]: I1007 13:07:46.286027 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:46Z","lastTransitionTime":"2025-10-07T13:07:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:07:46 crc kubenswrapper[4677]: I1007 13:07:46.303033 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 13:07:46 crc kubenswrapper[4677]: E1007 13:07:46.303164 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 13:07:46 crc kubenswrapper[4677]: I1007 13:07:46.388884 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:46 crc kubenswrapper[4677]: I1007 13:07:46.388955 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:46 crc kubenswrapper[4677]: I1007 13:07:46.388972 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:46 crc kubenswrapper[4677]: I1007 13:07:46.388996 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:46 crc kubenswrapper[4677]: I1007 13:07:46.389014 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:46Z","lastTransitionTime":"2025-10-07T13:07:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:07:46 crc kubenswrapper[4677]: I1007 13:07:46.492180 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:46 crc kubenswrapper[4677]: I1007 13:07:46.492220 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:46 crc kubenswrapper[4677]: I1007 13:07:46.492232 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:46 crc kubenswrapper[4677]: I1007 13:07:46.492248 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:46 crc kubenswrapper[4677]: I1007 13:07:46.492260 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:46Z","lastTransitionTime":"2025-10-07T13:07:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:07:46 crc kubenswrapper[4677]: I1007 13:07:46.594863 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:46 crc kubenswrapper[4677]: I1007 13:07:46.594947 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:46 crc kubenswrapper[4677]: I1007 13:07:46.594969 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:46 crc kubenswrapper[4677]: I1007 13:07:46.594999 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:46 crc kubenswrapper[4677]: I1007 13:07:46.595020 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:46Z","lastTransitionTime":"2025-10-07T13:07:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:07:46 crc kubenswrapper[4677]: I1007 13:07:46.698151 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:46 crc kubenswrapper[4677]: I1007 13:07:46.698225 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:46 crc kubenswrapper[4677]: I1007 13:07:46.698252 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:46 crc kubenswrapper[4677]: I1007 13:07:46.698279 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:46 crc kubenswrapper[4677]: I1007 13:07:46.698296 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:46Z","lastTransitionTime":"2025-10-07T13:07:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:07:46 crc kubenswrapper[4677]: I1007 13:07:46.801477 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:46 crc kubenswrapper[4677]: I1007 13:07:46.802417 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:46 crc kubenswrapper[4677]: I1007 13:07:46.802655 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:46 crc kubenswrapper[4677]: I1007 13:07:46.802896 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:46 crc kubenswrapper[4677]: I1007 13:07:46.803068 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:46Z","lastTransitionTime":"2025-10-07T13:07:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:07:46 crc kubenswrapper[4677]: I1007 13:07:46.871843 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f63a77a6-7e4a-4ed0-a996-b8f80233d10c-metrics-certs\") pod \"network-metrics-daemon-8bljr\" (UID: \"f63a77a6-7e4a-4ed0-a996-b8f80233d10c\") " pod="openshift-multus/network-metrics-daemon-8bljr" Oct 07 13:07:46 crc kubenswrapper[4677]: E1007 13:07:46.872041 4677 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 07 13:07:46 crc kubenswrapper[4677]: E1007 13:07:46.872518 4677 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f63a77a6-7e4a-4ed0-a996-b8f80233d10c-metrics-certs podName:f63a77a6-7e4a-4ed0-a996-b8f80233d10c nodeName:}" failed. No retries permitted until 2025-10-07 13:07:50.87248922 +0000 UTC m=+42.358198365 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f63a77a6-7e4a-4ed0-a996-b8f80233d10c-metrics-certs") pod "network-metrics-daemon-8bljr" (UID: "f63a77a6-7e4a-4ed0-a996-b8f80233d10c") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 07 13:07:46 crc kubenswrapper[4677]: I1007 13:07:46.906355 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:46 crc kubenswrapper[4677]: I1007 13:07:46.906404 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:46 crc kubenswrapper[4677]: I1007 13:07:46.906420 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:46 crc kubenswrapper[4677]: I1007 13:07:46.906484 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:46 crc kubenswrapper[4677]: I1007 13:07:46.906502 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:46Z","lastTransitionTime":"2025-10-07T13:07:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:07:47 crc kubenswrapper[4677]: I1007 13:07:47.009843 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:47 crc kubenswrapper[4677]: I1007 13:07:47.009992 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:47 crc kubenswrapper[4677]: I1007 13:07:47.010020 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:47 crc kubenswrapper[4677]: I1007 13:07:47.010050 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:47 crc kubenswrapper[4677]: I1007 13:07:47.010072 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:47Z","lastTransitionTime":"2025-10-07T13:07:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:07:47 crc kubenswrapper[4677]: I1007 13:07:47.112857 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:47 crc kubenswrapper[4677]: I1007 13:07:47.112904 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:47 crc kubenswrapper[4677]: I1007 13:07:47.112915 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:47 crc kubenswrapper[4677]: I1007 13:07:47.112931 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:47 crc kubenswrapper[4677]: I1007 13:07:47.112942 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:47Z","lastTransitionTime":"2025-10-07T13:07:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:07:47 crc kubenswrapper[4677]: I1007 13:07:47.215611 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:47 crc kubenswrapper[4677]: I1007 13:07:47.215646 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:47 crc kubenswrapper[4677]: I1007 13:07:47.215655 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:47 crc kubenswrapper[4677]: I1007 13:07:47.215667 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:47 crc kubenswrapper[4677]: I1007 13:07:47.215677 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:47Z","lastTransitionTime":"2025-10-07T13:07:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:07:47 crc kubenswrapper[4677]: I1007 13:07:47.303535 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 13:07:47 crc kubenswrapper[4677]: I1007 13:07:47.303561 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:07:47 crc kubenswrapper[4677]: I1007 13:07:47.303559 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8bljr" Oct 07 13:07:47 crc kubenswrapper[4677]: E1007 13:07:47.303644 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 13:07:47 crc kubenswrapper[4677]: E1007 13:07:47.303770 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8bljr" podUID="f63a77a6-7e4a-4ed0-a996-b8f80233d10c" Oct 07 13:07:47 crc kubenswrapper[4677]: E1007 13:07:47.303911 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 13:07:47 crc kubenswrapper[4677]: I1007 13:07:47.317786 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:47 crc kubenswrapper[4677]: I1007 13:07:47.317829 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:47 crc kubenswrapper[4677]: I1007 13:07:47.317839 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:47 crc kubenswrapper[4677]: I1007 13:07:47.317855 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:47 crc kubenswrapper[4677]: I1007 13:07:47.317867 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:47Z","lastTransitionTime":"2025-10-07T13:07:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:07:47 crc kubenswrapper[4677]: I1007 13:07:47.420585 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:47 crc kubenswrapper[4677]: I1007 13:07:47.420651 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:47 crc kubenswrapper[4677]: I1007 13:07:47.420675 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:47 crc kubenswrapper[4677]: I1007 13:07:47.420704 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:47 crc kubenswrapper[4677]: I1007 13:07:47.420726 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:47Z","lastTransitionTime":"2025-10-07T13:07:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:07:47 crc kubenswrapper[4677]: I1007 13:07:47.523375 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:47 crc kubenswrapper[4677]: I1007 13:07:47.523418 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:47 crc kubenswrapper[4677]: I1007 13:07:47.523461 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:47 crc kubenswrapper[4677]: I1007 13:07:47.523484 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:47 crc kubenswrapper[4677]: I1007 13:07:47.523497 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:47Z","lastTransitionTime":"2025-10-07T13:07:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:07:47 crc kubenswrapper[4677]: I1007 13:07:47.626200 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:47 crc kubenswrapper[4677]: I1007 13:07:47.626270 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:47 crc kubenswrapper[4677]: I1007 13:07:47.626292 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:47 crc kubenswrapper[4677]: I1007 13:07:47.626319 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:47 crc kubenswrapper[4677]: I1007 13:07:47.626339 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:47Z","lastTransitionTime":"2025-10-07T13:07:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:07:47 crc kubenswrapper[4677]: I1007 13:07:47.730275 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:47 crc kubenswrapper[4677]: I1007 13:07:47.730358 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:47 crc kubenswrapper[4677]: I1007 13:07:47.730380 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:47 crc kubenswrapper[4677]: I1007 13:07:47.730404 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:47 crc kubenswrapper[4677]: I1007 13:07:47.730421 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:47Z","lastTransitionTime":"2025-10-07T13:07:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:07:47 crc kubenswrapper[4677]: I1007 13:07:47.833477 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:47 crc kubenswrapper[4677]: I1007 13:07:47.833540 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:47 crc kubenswrapper[4677]: I1007 13:07:47.833557 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:47 crc kubenswrapper[4677]: I1007 13:07:47.833583 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:47 crc kubenswrapper[4677]: I1007 13:07:47.833602 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:47Z","lastTransitionTime":"2025-10-07T13:07:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:07:47 crc kubenswrapper[4677]: I1007 13:07:47.936337 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:47 crc kubenswrapper[4677]: I1007 13:07:47.936390 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:47 crc kubenswrapper[4677]: I1007 13:07:47.936408 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:47 crc kubenswrapper[4677]: I1007 13:07:47.936426 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:47 crc kubenswrapper[4677]: I1007 13:07:47.936455 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:47Z","lastTransitionTime":"2025-10-07T13:07:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:07:48 crc kubenswrapper[4677]: I1007 13:07:48.039771 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:48 crc kubenswrapper[4677]: I1007 13:07:48.039840 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:48 crc kubenswrapper[4677]: I1007 13:07:48.039861 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:48 crc kubenswrapper[4677]: I1007 13:07:48.039890 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:48 crc kubenswrapper[4677]: I1007 13:07:48.039911 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:48Z","lastTransitionTime":"2025-10-07T13:07:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:07:48 crc kubenswrapper[4677]: I1007 13:07:48.143063 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:48 crc kubenswrapper[4677]: I1007 13:07:48.143139 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:48 crc kubenswrapper[4677]: I1007 13:07:48.143169 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:48 crc kubenswrapper[4677]: I1007 13:07:48.143198 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:48 crc kubenswrapper[4677]: I1007 13:07:48.143221 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:48Z","lastTransitionTime":"2025-10-07T13:07:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:07:48 crc kubenswrapper[4677]: I1007 13:07:48.246504 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:48 crc kubenswrapper[4677]: I1007 13:07:48.246813 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:48 crc kubenswrapper[4677]: I1007 13:07:48.247014 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:48 crc kubenswrapper[4677]: I1007 13:07:48.247216 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:48 crc kubenswrapper[4677]: I1007 13:07:48.247415 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:48Z","lastTransitionTime":"2025-10-07T13:07:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:07:48 crc kubenswrapper[4677]: I1007 13:07:48.302760 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 13:07:48 crc kubenswrapper[4677]: E1007 13:07:48.302920 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 13:07:48 crc kubenswrapper[4677]: I1007 13:07:48.350260 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:48 crc kubenswrapper[4677]: I1007 13:07:48.350343 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:48 crc kubenswrapper[4677]: I1007 13:07:48.350363 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:48 crc kubenswrapper[4677]: I1007 13:07:48.350389 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:48 crc kubenswrapper[4677]: I1007 13:07:48.350408 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:48Z","lastTransitionTime":"2025-10-07T13:07:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:07:48 crc kubenswrapper[4677]: I1007 13:07:48.453502 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:48 crc kubenswrapper[4677]: I1007 13:07:48.453572 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:48 crc kubenswrapper[4677]: I1007 13:07:48.453596 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:48 crc kubenswrapper[4677]: I1007 13:07:48.453641 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:48 crc kubenswrapper[4677]: I1007 13:07:48.453658 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:48Z","lastTransitionTime":"2025-10-07T13:07:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:07:48 crc kubenswrapper[4677]: I1007 13:07:48.556979 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:48 crc kubenswrapper[4677]: I1007 13:07:48.557030 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:48 crc kubenswrapper[4677]: I1007 13:07:48.557045 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:48 crc kubenswrapper[4677]: I1007 13:07:48.557064 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:48 crc kubenswrapper[4677]: I1007 13:07:48.557078 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:48Z","lastTransitionTime":"2025-10-07T13:07:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:07:48 crc kubenswrapper[4677]: I1007 13:07:48.662988 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:48 crc kubenswrapper[4677]: I1007 13:07:48.663032 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:48 crc kubenswrapper[4677]: I1007 13:07:48.663041 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:48 crc kubenswrapper[4677]: I1007 13:07:48.663056 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:48 crc kubenswrapper[4677]: I1007 13:07:48.663072 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:48Z","lastTransitionTime":"2025-10-07T13:07:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:07:48 crc kubenswrapper[4677]: I1007 13:07:48.766544 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:48 crc kubenswrapper[4677]: I1007 13:07:48.766759 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:48 crc kubenswrapper[4677]: I1007 13:07:48.766822 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:48 crc kubenswrapper[4677]: I1007 13:07:48.766922 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:48 crc kubenswrapper[4677]: I1007 13:07:48.766985 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:48Z","lastTransitionTime":"2025-10-07T13:07:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:07:48 crc kubenswrapper[4677]: I1007 13:07:48.869629 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:48 crc kubenswrapper[4677]: I1007 13:07:48.869675 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:48 crc kubenswrapper[4677]: I1007 13:07:48.869691 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:48 crc kubenswrapper[4677]: I1007 13:07:48.869714 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:48 crc kubenswrapper[4677]: I1007 13:07:48.869730 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:48Z","lastTransitionTime":"2025-10-07T13:07:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:07:48 crc kubenswrapper[4677]: I1007 13:07:48.973213 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:48 crc kubenswrapper[4677]: I1007 13:07:48.973273 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:48 crc kubenswrapper[4677]: I1007 13:07:48.973293 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:48 crc kubenswrapper[4677]: I1007 13:07:48.973319 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:48 crc kubenswrapper[4677]: I1007 13:07:48.973339 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:48Z","lastTransitionTime":"2025-10-07T13:07:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:07:49 crc kubenswrapper[4677]: I1007 13:07:49.075586 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:49 crc kubenswrapper[4677]: I1007 13:07:49.075676 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:49 crc kubenswrapper[4677]: I1007 13:07:49.075693 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:49 crc kubenswrapper[4677]: I1007 13:07:49.075717 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:49 crc kubenswrapper[4677]: I1007 13:07:49.075733 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:49Z","lastTransitionTime":"2025-10-07T13:07:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:07:49 crc kubenswrapper[4677]: I1007 13:07:49.177382 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:49 crc kubenswrapper[4677]: I1007 13:07:49.177695 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:49 crc kubenswrapper[4677]: I1007 13:07:49.177782 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:49 crc kubenswrapper[4677]: I1007 13:07:49.177855 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:49 crc kubenswrapper[4677]: I1007 13:07:49.177932 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:49Z","lastTransitionTime":"2025-10-07T13:07:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:07:49 crc kubenswrapper[4677]: I1007 13:07:49.279680 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:49 crc kubenswrapper[4677]: I1007 13:07:49.279727 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:49 crc kubenswrapper[4677]: I1007 13:07:49.279738 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:49 crc kubenswrapper[4677]: I1007 13:07:49.279756 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:49 crc kubenswrapper[4677]: I1007 13:07:49.279770 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:49Z","lastTransitionTime":"2025-10-07T13:07:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:07:49 crc kubenswrapper[4677]: I1007 13:07:49.303111 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8bljr" Oct 07 13:07:49 crc kubenswrapper[4677]: I1007 13:07:49.303111 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 13:07:49 crc kubenswrapper[4677]: E1007 13:07:49.303491 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8bljr" podUID="f63a77a6-7e4a-4ed0-a996-b8f80233d10c" Oct 07 13:07:49 crc kubenswrapper[4677]: E1007 13:07:49.303504 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 13:07:49 crc kubenswrapper[4677]: I1007 13:07:49.303148 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:07:49 crc kubenswrapper[4677]: E1007 13:07:49.303834 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 13:07:49 crc kubenswrapper[4677]: I1007 13:07:49.320037 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:49Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:49 crc kubenswrapper[4677]: I1007 13:07:49.331288 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8xd94" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f78c6e9a-e5e3-4296-b2e6-3ba36d1808ff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2e7ebbc9f01ac7f853075c65c8cc57c691cf3f95e41036294486ad4a3bb807c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8xd94\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:49Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:49 crc kubenswrapper[4677]: I1007 13:07:49.353105 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9c35782-52f8-4fbc-9e52-07ee92002e3d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9182677b05c8d32f333b4e806b6dc29e0ce3f6171616ed303459ccb6a3754a4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://550a4491cebcbd8b3a62831cce07b13bb79051cd51505aab1f74bcfee692f7b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d90d7cc786a9f94c269a99be97c00685a2e10bde12e0afe4db2de40b95749a47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"la
stState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e99541a9f53339e760fb1074be18ebfcb8b225c64c290478559d2e3722ba9296\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73cec6b690e4d01d9d206a812f278832d622a7bdfb74ddcfb5904e19f721fae6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f16639187300b01b7c9f30dda26da7c1de9de9c66baf7ad716875eba41a5d7d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f16639187300b01b7c9f30dda26da7c1de9de9c66baf7ad716875eba41a5d7d6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dab4e59081dcacddedf345841dce8c940fae52aeee55d6c956e1364dd70872b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",
\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dab4e59081dcacddedf345841dce8c940fae52aeee55d6c956e1364dd70872b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://abcfe5aa83c73ae24b7f0b414b47344969924ca50a5e1669bfb4704f1386cd17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abcfe5aa83c73ae24b7f0b414b47344969924ca50a5e1669bfb4704f1386cd17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:49Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:49 crc kubenswrapper[4677]: I1007 13:07:49.369693 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ad65790-6a90-4c21-b5c5-ac1ddf2cbe52\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34b15b8f71a10920a74f784c3440031e14726f661e10f628b269da08e70a7cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a51c6d4e82136b444754dc679f864558f74624af4ff94f794e473c92c8f6c87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96d7ddc445a61d5fd4959d1b3b4e2c93503111a12f461d945dd298a3f8540f65\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13a4e9dcaf4f4585c45625444ba093f84acc83f03e96235efd054bed4a38fc21\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f3139524d94a487c756b4a7e3660c0a4380bcba8aa8b588368530f43aab0b1d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T13:07:25Z\\\",\\\"message\\\":\\\"le observer\\\\nW1007 13:07:25.065953 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1007 13:07:25.066115 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 13:07:25.067558 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2107846005/tls.crt::/tmp/serving-cert-2107846005/tls.key\\\\\\\"\\\\nI1007 13:07:25.378394 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1007 13:07:25.383525 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1007 13:07:25.383565 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1007 13:07:25.383605 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1007 13:07:25.383617 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1007 13:07:25.390977 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1007 13:07:25.391014 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1007 13:07:25.391020 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 13:07:25.391038 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 13:07:25.391053 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1007 13:07:25.391061 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1007 13:07:25.391071 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1007 13:07:25.391088 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1007 13:07:25.394664 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b92962cb41f37a615f473651c01e37f5d53e01f3fb4b7c0eb2092095bb55239\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d927fbc5242eb951e8c2ec6d2859315f34e4b190bb43e646044111d1f583bf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d927fbc5242eb951e8c2ec6d2859315f34e4b190bb43e646044111d1f583bf0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:49Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:49 crc kubenswrapper[4677]: I1007 13:07:49.382428 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:49 crc kubenswrapper[4677]: I1007 13:07:49.382482 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:49 crc kubenswrapper[4677]: I1007 13:07:49.382491 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:49 crc kubenswrapper[4677]: I1007 13:07:49.382505 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:49 crc kubenswrapper[4677]: I1007 13:07:49.382514 4677 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:49Z","lastTransitionTime":"2025-10-07T13:07:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:07:49 crc kubenswrapper[4677]: I1007 13:07:49.385688 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ef29067dc23263a94c4f861ada9ebbe04aae442de3da9fa34db521177f60ce8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:49Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:49 crc kubenswrapper[4677]: I1007 13:07:49.402197 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:49Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:49 crc kubenswrapper[4677]: I1007 13:07:49.412293 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e49f4b2f98de9e297e6a31a5583120192adf9a013700b49bb419e54d9e75fdbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:49Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:49 crc kubenswrapper[4677]: I1007 13:07:49.423013 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pjgpx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"73bebfb3-50b5-48b6-b348-1d1feb6202d2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b0ac92c71edc3d5107aece2d0e005a546cf25d79d696f4e330b7c0c8babc546\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"
},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h59cg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pjgpx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:49Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:49 crc kubenswrapper[4677]: I1007 13:07:49.430987 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qf2v9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea5a5436-29f6-4edd-9d4d-22eb9dd828c3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1a8e8a31adbc84ed02ff984941bb00da95740b19e8717fc6d4fb39b62338973\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ftmxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://97188126bcea5ad3844f74c9402e831926e1142944778240b4d4b26da7ea40c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:42Z
\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ftmxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qf2v9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:49Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:49 crc kubenswrapper[4677]: I1007 13:07:49.439680 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8bljr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f63a77a6-7e4a-4ed0-a996-b8f80233d10c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rc97j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rc97j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:43Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8bljr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:49Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:49 crc kubenswrapper[4677]: I1007 13:07:49.449870 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c40c47d2-50a4-43f1-9b6e-08b60a3260c5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f36e52a7e88b59d8fd38c1fe659ce9b539e514c9d31e326a3ed647ebb8d19781\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://84b1c015461fecca9e5122abe950f33e24f4b7188568ea84cb059a08a4637963\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ecf81a2a9f147c0d9643f8e6c45248164053203ca4e5bbdc57c38e5803a5386\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e8dc3f8bdc52104efdb49a017d6497e2aaa3ed2b593794413fcd1acf2e06d36\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:49Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:49 crc kubenswrapper[4677]: I1007 13:07:49.459302 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:49Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:49 crc kubenswrapper[4677]: I1007 13:07:49.468709 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-r7cnz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7879fa59-a7cb-4d29-ba3a-c91f43bfcba6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5b2cfaaf4533573a7cdf927cb9a0b61690f4f04ca22f5da5013fd218ee2cba1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r59hd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d3e4ef8267212ad1faf24bfcb3b6f633a283684ba587e304e94d434bd9a2618\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r59hd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-r7cnz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:49Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:49 crc kubenswrapper[4677]: I1007 13:07:49.484960 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:49 crc kubenswrapper[4677]: I1007 13:07:49.484996 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:49 crc kubenswrapper[4677]: I1007 13:07:49.485020 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:49 crc kubenswrapper[4677]: I1007 13:07:49.485037 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:49 crc kubenswrapper[4677]: I1007 13:07:49.485045 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:49Z","lastTransitionTime":"2025-10-07T13:07:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:07:49 crc kubenswrapper[4677]: I1007 13:07:49.485081 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29c8j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3458826a-000d-407d-92c8-236d1a05842e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cf7d8cdd34bc883eae38c5e4690efd4e1c29cc633b5bbadc5de2b5b844a9da3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b77b2aafb3baf1c5b72d62156bd1c1bec76385637d5795166fe3d4f22a169503\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\
":\\\"cri-o://99b5fbb5ad3249aa5264c37bd635ed5f6283ec72c7eb071002cd7bddc12052f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eee7c253a1a514447553be977a3e534608ef6a1178664bf139ee84ec41180db0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f333db7aeb7d3cd308131992b4cd1284c1c56e27bbfd731404febc0efc953925\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ddf4e352b778815786f6fb204486a53d958310e53569f89a2895fe388a727da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fba8b3547cb6e55cb13bf4889eabf3bf5e731d44475c488c659fe844b985074\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7dba5075e656b746f559affe6da7c0989fbbf09769b561bbdd1ddb8e552134a4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T13:07:39Z\\\",\\\"message\\\":\\\"orkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1007 13:07:39.213014 5962 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1007 13:07:39.213230 5962 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/factory.go:117\\\\nI1007 13:07:39.213573 5962 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1007 13:07:39.213591 5962 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1007 13:07:39.213763 5962 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1007 13:07:39.214112 5962 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1007 13:07:39.214258 5962 reflector.go:311] Stopping reflector *v1.Namespace (0s) from 
k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:36Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fba8b3547cb6e55cb13bf4889eabf3bf5e731d44475c488c659fe844b985074\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T13:07:41Z\\\",\\\"message\\\":\\\"mage-registry]} name:Service_openshift-image-registry/image-registry_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.93:5000:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {83c1e277-3d22-42ae-a355-f7a0ff0bd171}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1007 13:07:41.452562 6109 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-ingress-canary/ingress-canary]} name:Service_openshift-ingress-canary/ingress-canary_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.34:8443: 10.217.5.34:8888:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7715118b-bb1b-400a-803e-7ab2cc3eeec0}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1007 13:07:41.452766 6109 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-console/console\\\\\\\"}\\\\nF1007 13:07:41.452679 6109 
ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1f410624ff7e026c196d43d5ef830ce7b34981b703d5399a135dab0122640ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa229d0805576aa04db8e58e52e8c8bdc07663414dde42a94cac4fe628cfac7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd4
7ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa229d0805576aa04db8e58e52e8c8bdc07663414dde42a94cac4fe628cfac7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-29c8j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:49Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:49 crc kubenswrapper[4677]: I1007 13:07:49.495129 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf9347ca53bc58ad2e19bdbccd5eb40fde5ef36cdc0c2a2899e7e86977208446\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7932ee6d24ab75f34dabb17b5c2732dc1437e94b4fab6cace5c5bf4d8b4a8fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:49Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:49 crc kubenswrapper[4677]: I1007 13:07:49.505454 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c2h2k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6a7b491-6ed9-4906-8d2d-d8913a581b95\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://079f44a0676fd6e659268707658dfce76f5c80881ebd1b7f77b831a653002cbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gh4gv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c2h2k\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:49Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:49 crc kubenswrapper[4677]: I1007 13:07:49.520700 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-czmsr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67f7f734-b59a-447c-b4a5-5aeb78d3a4dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://620c115cda692d86e1c655fe633ade8d56b4ad3faff70ec3383e0d6931e91acd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84e2aad54ae491e74845fbb681491bde96694b5f242d15a1b4c0f62b81048e7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84e2aad54ae491e74845fbb681491bde96694b5f242d15a1b4c0f62b81048e7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2dfd4b5d1804d9d8e9ccf4dea8fb23cf60cd039e32d4bfc7d0b0e189da178b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2dfd4b5d1804d9d8e9ccf4dea8fb23cf60cd039e32d4bfc7d0b0e189da178b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c337ed2525a5d0aca41ff193138829ecd98172765110f97081d5a9b57ea39519\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c337ed2525a5d0aca41ff193138829ecd98172765110f97081d5a9b57ea39519\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b53d862bdcbdf9ebb58ca6717073ecf18236be981bf633bf22e3a077fa6ed90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b53d862bdcbdf9ebb58ca6717073ecf18236be981bf633bf22e3a077fa6ed90\\\",\\\"exitCode\\
\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3de85e8780fadb509355343e297ea73c2bd1047219be9911d445e044d40d378b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3de85e8780fadb509355343e297ea73c2bd1047219be9911d445e044d40d378b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6309cf358d3d85018e9e89fdb974146ab00944de09d3b38cfbf357df59b442f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6309cf358d3d85018e9e89fdb974146ab00944de09d3b38cfbf357df59b442f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-czmsr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2025-10-07T13:07:49Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:49 crc kubenswrapper[4677]: I1007 13:07:49.588065 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:49 crc kubenswrapper[4677]: I1007 13:07:49.588118 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:49 crc kubenswrapper[4677]: I1007 13:07:49.588135 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:49 crc kubenswrapper[4677]: I1007 13:07:49.588155 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:49 crc kubenswrapper[4677]: I1007 13:07:49.588170 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:49Z","lastTransitionTime":"2025-10-07T13:07:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:07:49 crc kubenswrapper[4677]: I1007 13:07:49.691049 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:49 crc kubenswrapper[4677]: I1007 13:07:49.691113 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:49 crc kubenswrapper[4677]: I1007 13:07:49.691137 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:49 crc kubenswrapper[4677]: I1007 13:07:49.691199 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:49 crc kubenswrapper[4677]: I1007 13:07:49.691221 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:49Z","lastTransitionTime":"2025-10-07T13:07:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:07:49 crc kubenswrapper[4677]: I1007 13:07:49.794271 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:49 crc kubenswrapper[4677]: I1007 13:07:49.794328 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:49 crc kubenswrapper[4677]: I1007 13:07:49.794344 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:49 crc kubenswrapper[4677]: I1007 13:07:49.794366 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:49 crc kubenswrapper[4677]: I1007 13:07:49.794383 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:49Z","lastTransitionTime":"2025-10-07T13:07:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:07:49 crc kubenswrapper[4677]: I1007 13:07:49.899097 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:49 crc kubenswrapper[4677]: I1007 13:07:49.899157 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:49 crc kubenswrapper[4677]: I1007 13:07:49.899172 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:49 crc kubenswrapper[4677]: I1007 13:07:49.899193 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:49 crc kubenswrapper[4677]: I1007 13:07:49.899207 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:49Z","lastTransitionTime":"2025-10-07T13:07:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:07:50 crc kubenswrapper[4677]: I1007 13:07:50.003219 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:50 crc kubenswrapper[4677]: I1007 13:07:50.003279 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:50 crc kubenswrapper[4677]: I1007 13:07:50.003297 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:50 crc kubenswrapper[4677]: I1007 13:07:50.003326 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:50 crc kubenswrapper[4677]: I1007 13:07:50.003345 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:50Z","lastTransitionTime":"2025-10-07T13:07:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:07:50 crc kubenswrapper[4677]: I1007 13:07:50.107007 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:50 crc kubenswrapper[4677]: I1007 13:07:50.107266 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:50 crc kubenswrapper[4677]: I1007 13:07:50.107397 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:50 crc kubenswrapper[4677]: I1007 13:07:50.107610 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:50 crc kubenswrapper[4677]: I1007 13:07:50.107762 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:50Z","lastTransitionTime":"2025-10-07T13:07:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:07:50 crc kubenswrapper[4677]: I1007 13:07:50.210649 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:50 crc kubenswrapper[4677]: I1007 13:07:50.210700 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:50 crc kubenswrapper[4677]: I1007 13:07:50.210722 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:50 crc kubenswrapper[4677]: I1007 13:07:50.210753 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:50 crc kubenswrapper[4677]: I1007 13:07:50.210779 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:50Z","lastTransitionTime":"2025-10-07T13:07:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:07:50 crc kubenswrapper[4677]: I1007 13:07:50.302119 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 13:07:50 crc kubenswrapper[4677]: E1007 13:07:50.302484 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 13:07:50 crc kubenswrapper[4677]: I1007 13:07:50.313576 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:50 crc kubenswrapper[4677]: I1007 13:07:50.313689 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:50 crc kubenswrapper[4677]: I1007 13:07:50.313764 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:50 crc kubenswrapper[4677]: I1007 13:07:50.313831 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:50 crc kubenswrapper[4677]: I1007 13:07:50.313889 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:50Z","lastTransitionTime":"2025-10-07T13:07:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:07:50 crc kubenswrapper[4677]: I1007 13:07:50.417042 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:50 crc kubenswrapper[4677]: I1007 13:07:50.417087 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:50 crc kubenswrapper[4677]: I1007 13:07:50.417103 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:50 crc kubenswrapper[4677]: I1007 13:07:50.417126 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:50 crc kubenswrapper[4677]: I1007 13:07:50.417143 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:50Z","lastTransitionTime":"2025-10-07T13:07:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:07:50 crc kubenswrapper[4677]: I1007 13:07:50.462685 4677 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-29c8j" Oct 07 13:07:50 crc kubenswrapper[4677]: I1007 13:07:50.464400 4677 scope.go:117] "RemoveContainer" containerID="8fba8b3547cb6e55cb13bf4889eabf3bf5e731d44475c488c659fe844b985074" Oct 07 13:07:50 crc kubenswrapper[4677]: E1007 13:07:50.464784 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-29c8j_openshift-ovn-kubernetes(3458826a-000d-407d-92c8-236d1a05842e)\"" pod="openshift-ovn-kubernetes/ovnkube-node-29c8j" podUID="3458826a-000d-407d-92c8-236d1a05842e" Oct 07 13:07:50 crc kubenswrapper[4677]: I1007 13:07:50.496097 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29c8j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3458826a-000d-407d-92c8-236d1a05842e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cf7d8cdd34bc883eae38c5e4690efd4e1c29cc633b5bbadc5de2b5b844a9da3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b77b2aafb3baf1c5b72d62156bd1c1bec76385637d5795166fe3d4f22a169503\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99b5fbb5ad3249aa5264c37bd635ed5f6283ec72c7eb071002cd7bddc12052f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eee7c253a1a514447553be977a3e534608ef6a1178664bf139ee84ec41180db0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f333db7aeb7d3cd308131992b4cd1284c1c56e27bbfd731404febc0efc953925\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ddf4e352b778815786f6fb204486a53d958310e53569f89a2895fe388a727da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fba8b3547cb6e55cb13bf4889eabf3bf5e731d4
4475c488c659fe844b985074\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fba8b3547cb6e55cb13bf4889eabf3bf5e731d44475c488c659fe844b985074\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T13:07:41Z\\\",\\\"message\\\":\\\"mage-registry]} name:Service_openshift-image-registry/image-registry_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.93:5000:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {83c1e277-3d22-42ae-a355-f7a0ff0bd171}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1007 13:07:41.452562 6109 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-ingress-canary/ingress-canary]} name:Service_openshift-ingress-canary/ingress-canary_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.34:8443: 10.217.5.34:8888:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7715118b-bb1b-400a-803e-7ab2cc3eeec0}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1007 13:07:41.452766 6109 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-console/console\\\\\\\"}\\\\nF1007 13:07:41.452679 6109 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:40Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-29c8j_openshift-ovn-kubernetes(3458826a-000d-407d-92c8-236d1a05842e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1f410624ff7e026c196d43d5ef830ce7b34981b703d5399a135dab0122640ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa229d0805576aa04db8e58e52e8c8bdc07663414dde42a94cac4fe628cfac7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa229d0805576aa04db8e58e52e8c8bdc07663414dde42a94cac4fe628cfac7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-29c8j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:50Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:50 crc kubenswrapper[4677]: I1007 13:07:50.516915 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pjgpx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"73bebfb3-50b5-48b6-b348-1d1feb6202d2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b0ac92c71edc3d5107aece2d0e005a546cf25d79d696f4e330b7c0c8babc546\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-
cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h59cg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pjgpx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:50Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:50 crc kubenswrapper[4677]: I1007 13:07:50.519952 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:50 crc kubenswrapper[4677]: I1007 13:07:50.520211 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:50 crc kubenswrapper[4677]: I1007 13:07:50.520384 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:50 crc kubenswrapper[4677]: I1007 13:07:50.520627 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:50 crc kubenswrapper[4677]: I1007 13:07:50.520894 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:50Z","lastTransitionTime":"2025-10-07T13:07:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:07:50 crc kubenswrapper[4677]: I1007 13:07:50.537188 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qf2v9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea5a5436-29f6-4edd-9d4d-22eb9dd828c3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1a8e8a31adbc84ed02ff984941bb00da95740b19e8717fc6d4fb39b62338973\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ftmxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://97188126bcea5ad3844f74c9402e831926e1142944778240b4d4b26da7ea40c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ftmxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qf2v9\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:50Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:50 crc kubenswrapper[4677]: I1007 13:07:50.552987 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8bljr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f63a77a6-7e4a-4ed0-a996-b8f80233d10c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rc97j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rc97j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:43Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8bljr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:50Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:50 crc 
kubenswrapper[4677]: I1007 13:07:50.571695 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c40c47d2-50a4-43f1-9b6e-08b60a3260c5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f36e52a7e88b59d8fd38c1fe659ce9b539e514c9d31e326a3ed647ebb8d19781\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://84b1c015461fecca9e5122abe950f33e24f4b7188568ea84cb059a08a4637963\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ecf81a2a9f147c0d9643f8e6c45248164053203ca4e5bbdc57c38e5803a5386\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\
":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e8dc3f8bdc52104efdb49a017d6497e2aaa3ed2b593794413fcd1acf2e06d36\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:50Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:50 crc kubenswrapper[4677]: I1007 13:07:50.585215 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:50Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:50 crc kubenswrapper[4677]: I1007 13:07:50.599003 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-r7cnz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7879fa59-a7cb-4d29-ba3a-c91f43bfcba6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5b2cfaaf4533573a7cdf927cb9a0b61690f4f04ca22f5da5013fd218ee2cba1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r59hd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d3e4ef8267212ad1faf24bfcb3b6f633a283684ba587e304e94d434bd9a2618\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r59hd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-r7cnz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:50Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:50 crc kubenswrapper[4677]: I1007 13:07:50.615872 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-czmsr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67f7f734-b59a-447c-b4a5-5aeb78d3a4dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://620c115cda692d86e1c655fe633ade8d56b4ad3faff70ec3383e0d6931e91acd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84e2aad54ae491e74845fbb681491bde96694b5f242d15a1b4c0f62b81048e7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"
,\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84e2aad54ae491e74845fbb681491bde96694b5f242d15a1b4c0f62b81048e7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2dfd4b5d1804d9d8e9ccf4dea8fb23cf60cd039e32d4bfc7d0b0e189da178b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2dfd4b5d1804d9d8e9ccf4dea8fb23cf60cd039e32d4bfc7d0b0e189da178b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c337ed2525a5d0aca41ff193138829ecd98172765110f97081d5a9b57ea39519\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c337ed2525a5d0aca41ff193138829ecd98172765110f97081d5a9b57ea39519\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacc
ount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b53d862bdcbdf9ebb58ca6717073ecf18236be981bf633bf22e3a077fa6ed90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b53d862bdcbdf9ebb58ca6717073ecf18236be981bf633bf22e3a077fa6ed90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3de85e8780fadb509355343e297ea73c2bd1047219be9911d445e044d40d378b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3de85e8780fadb509355343e297ea73c2bd1047219be9911d445e044d40d378b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6309cf358d3d85018e9e89fdb974146ab00944de09d3b38cfbf357df59b442f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6309cf358d3d85018e9e89fdb974146ab00944de09d3b38cfbf357df59b442f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\"
:\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-czmsr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:50Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:50 crc kubenswrapper[4677]: I1007 13:07:50.623930 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:50 crc kubenswrapper[4677]: I1007 13:07:50.624086 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:50 crc kubenswrapper[4677]: I1007 13:07:50.624207 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:50 crc kubenswrapper[4677]: I1007 13:07:50.624511 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:50 crc kubenswrapper[4677]: I1007 13:07:50.624730 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:50Z","lastTransitionTime":"2025-10-07T13:07:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:07:50 crc kubenswrapper[4677]: I1007 13:07:50.633412 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf9347ca53bc58ad2e19bdbccd5eb40fde5ef36cdc0c2a2899e7e86977208446\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7932ee6d24ab75f34dabb17b5c2732dc1437e94b4fab6cace5c5bf4d8b4a8fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:50Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:50 crc kubenswrapper[4677]: I1007 13:07:50.645808 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c2h2k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6a7b491-6ed9-4906-8d2d-d8913a581b95\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://079f44a0676fd6e659268707658dfce76f5c80881ebd1b7f77b831a653002cbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gh4gv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c2h2k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:50Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:50 crc kubenswrapper[4677]: I1007 13:07:50.660575 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ef29067dc23263a94c4f861ada9ebbe04aae442de3da9fa34db521177f60ce8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:50Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:50 crc kubenswrapper[4677]: I1007 13:07:50.680137 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:50Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:50 crc kubenswrapper[4677]: I1007 13:07:50.696193 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8xd94" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f78c6e9a-e5e3-4296-b2e6-3ba36d1808ff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2e7ebbc9f01ac7f853075c65c8cc57c691cf3f95e41036294486ad4a3bb807c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8xd94\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:50Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:50 crc kubenswrapper[4677]: I1007 13:07:50.727987 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:50 crc kubenswrapper[4677]: I1007 13:07:50.728048 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:50 crc kubenswrapper[4677]: I1007 13:07:50.728072 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:50 crc kubenswrapper[4677]: I1007 13:07:50.728101 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:50 crc kubenswrapper[4677]: I1007 13:07:50.728124 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:50Z","lastTransitionTime":"2025-10-07T13:07:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:07:50 crc kubenswrapper[4677]: I1007 13:07:50.771257 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9c35782-52f8-4fbc-9e52-07ee92002e3d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9182677b05c8d32f333b4e806b6dc29e0ce3f6171616ed303459ccb6a3754a4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://550a4491cebcbd8b3a62831cce07b13bb79051c
d51505aab1f74bcfee692f7b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d90d7cc786a9f94c269a99be97c00685a2e10bde12e0afe4db2de40b95749a47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e99541a9f53339e760fb1074be18ebfcb8b225c64c290478559d2e3722ba9296\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73cec6b690e4d01d9d206a812f278832d622a7bdfb74ddcfb5904e19f721fae6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f16639187300b01b7c9f30dda26da7c1de9de9c66baf7ad716875eba41a5d7d6\\\",\\\"image\\\":\\\"quay.io/openshi
ft-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f16639187300b01b7c9f30dda26da7c1de9de9c66baf7ad716875eba41a5d7d6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dab4e59081dcacddedf345841dce8c940fae52aeee55d6c956e1364dd70872b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dab4e59081dcacddedf345841dce8c940fae52aeee55d6c956e1364dd70872b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://abcfe5aa83c73ae24b7f0b414b47344969924ca50a5e1669bfb4704f1386cd17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abcfe5aa83c73ae24b7f0b414b47344969924ca50a5e1669bfb4704f1386cd17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:50Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:50 crc kubenswrapper[4677]: I1007 13:07:50.783420 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ad65790-6a90-4c21-b5c5-ac1ddf2cbe52\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34b15b8f71a10920a74f784c3440031e14726f661e10f628b269da08e70a7cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a51c6d4e82136b444754dc679f864558f74624af4ff94f794e473c92c8f6c87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96d7ddc445a61d5fd4959d1b3b4e2c93503111a12f461d945dd298a3f8540f65\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13a4e9dcaf4f4585c45625444ba093f84acc83f03e96235efd054bed4a38fc21\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f3139524d94a487c756b4a7e3660c0a4380bcba8aa8b588368530f43aab0b1d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T13:07:25Z\\\",\\\"message\\\":\\\"le observer\\\\nW1007 13:07:25.065953 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1007 13:07:25.066115 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 13:07:25.067558 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2107846005/tls.crt::/tmp/serving-cert-2107846005/tls.key\\\\\\\"\\\\nI1007 13:07:25.378394 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1007 13:07:25.383525 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1007 13:07:25.383565 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1007 13:07:25.383605 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1007 13:07:25.383617 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1007 13:07:25.390977 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1007 13:07:25.391014 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1007 13:07:25.391020 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 13:07:25.391038 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 13:07:25.391053 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1007 13:07:25.391061 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1007 13:07:25.391071 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1007 13:07:25.391088 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1007 13:07:25.394664 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b92962cb41f37a615f473651c01e37f5d53e01f3fb4b7c0eb2092095bb55239\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d927fbc5242eb951e8c2ec6d2859315f34e4b190bb43e646044111d1f583bf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d927fbc5242eb951e8c2ec6d2859315f34e4b190bb43e646044111d1f583bf0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:50Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:50 crc kubenswrapper[4677]: I1007 13:07:50.796380 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:50Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:50 crc kubenswrapper[4677]: I1007 13:07:50.811116 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e49f4b2f98de9e297e6a31a5583120192adf9a013700b49bb419e54d9e75fdbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:50Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:50 crc kubenswrapper[4677]: I1007 13:07:50.829995 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:50 crc kubenswrapper[4677]: I1007 13:07:50.830068 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:50 crc kubenswrapper[4677]: I1007 13:07:50.830088 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:50 crc kubenswrapper[4677]: I1007 13:07:50.830118 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:50 crc kubenswrapper[4677]: I1007 13:07:50.830141 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:50Z","lastTransitionTime":"2025-10-07T13:07:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:07:50 crc kubenswrapper[4677]: I1007 13:07:50.922089 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f63a77a6-7e4a-4ed0-a996-b8f80233d10c-metrics-certs\") pod \"network-metrics-daemon-8bljr\" (UID: \"f63a77a6-7e4a-4ed0-a996-b8f80233d10c\") " pod="openshift-multus/network-metrics-daemon-8bljr" Oct 07 13:07:50 crc kubenswrapper[4677]: E1007 13:07:50.922227 4677 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 07 13:07:50 crc kubenswrapper[4677]: E1007 13:07:50.922288 4677 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f63a77a6-7e4a-4ed0-a996-b8f80233d10c-metrics-certs podName:f63a77a6-7e4a-4ed0-a996-b8f80233d10c nodeName:}" failed. No retries permitted until 2025-10-07 13:07:58.922271141 +0000 UTC m=+50.407980276 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f63a77a6-7e4a-4ed0-a996-b8f80233d10c-metrics-certs") pod "network-metrics-daemon-8bljr" (UID: "f63a77a6-7e4a-4ed0-a996-b8f80233d10c") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 07 13:07:50 crc kubenswrapper[4677]: I1007 13:07:50.932859 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:50 crc kubenswrapper[4677]: I1007 13:07:50.932888 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:50 crc kubenswrapper[4677]: I1007 13:07:50.932898 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:50 crc kubenswrapper[4677]: I1007 13:07:50.932911 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:50 crc kubenswrapper[4677]: I1007 13:07:50.932921 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:50Z","lastTransitionTime":"2025-10-07T13:07:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:07:51 crc kubenswrapper[4677]: I1007 13:07:51.035548 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:51 crc kubenswrapper[4677]: I1007 13:07:51.035614 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:51 crc kubenswrapper[4677]: I1007 13:07:51.035634 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:51 crc kubenswrapper[4677]: I1007 13:07:51.035658 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:51 crc kubenswrapper[4677]: I1007 13:07:51.035675 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:51Z","lastTransitionTime":"2025-10-07T13:07:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:07:51 crc kubenswrapper[4677]: I1007 13:07:51.138729 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:51 crc kubenswrapper[4677]: I1007 13:07:51.138783 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:51 crc kubenswrapper[4677]: I1007 13:07:51.138801 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:51 crc kubenswrapper[4677]: I1007 13:07:51.138826 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:51 crc kubenswrapper[4677]: I1007 13:07:51.138847 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:51Z","lastTransitionTime":"2025-10-07T13:07:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:07:51 crc kubenswrapper[4677]: I1007 13:07:51.241410 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:51 crc kubenswrapper[4677]: I1007 13:07:51.241483 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:51 crc kubenswrapper[4677]: I1007 13:07:51.241507 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:51 crc kubenswrapper[4677]: I1007 13:07:51.241530 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:51 crc kubenswrapper[4677]: I1007 13:07:51.241546 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:51Z","lastTransitionTime":"2025-10-07T13:07:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:07:51 crc kubenswrapper[4677]: I1007 13:07:51.302890 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8bljr" Oct 07 13:07:51 crc kubenswrapper[4677]: I1007 13:07:51.303009 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:07:51 crc kubenswrapper[4677]: E1007 13:07:51.303100 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8bljr" podUID="f63a77a6-7e4a-4ed0-a996-b8f80233d10c" Oct 07 13:07:51 crc kubenswrapper[4677]: I1007 13:07:51.303177 4677 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 13:07:51 crc kubenswrapper[4677]: E1007 13:07:51.303384 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 13:07:51 crc kubenswrapper[4677]: E1007 13:07:51.303541 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 13:07:51 crc kubenswrapper[4677]: I1007 13:07:51.344123 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:51 crc kubenswrapper[4677]: I1007 13:07:51.344181 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:51 crc kubenswrapper[4677]: I1007 13:07:51.344198 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:51 crc kubenswrapper[4677]: I1007 13:07:51.344226 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:51 crc kubenswrapper[4677]: I1007 13:07:51.344248 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:51Z","lastTransitionTime":"2025-10-07T13:07:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:07:51 crc kubenswrapper[4677]: I1007 13:07:51.447147 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:51 crc kubenswrapper[4677]: I1007 13:07:51.447201 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:51 crc kubenswrapper[4677]: I1007 13:07:51.447219 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:51 crc kubenswrapper[4677]: I1007 13:07:51.447240 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:51 crc kubenswrapper[4677]: I1007 13:07:51.447251 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:51Z","lastTransitionTime":"2025-10-07T13:07:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:07:51 crc kubenswrapper[4677]: I1007 13:07:51.550305 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:51 crc kubenswrapper[4677]: I1007 13:07:51.550360 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:51 crc kubenswrapper[4677]: I1007 13:07:51.550378 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:51 crc kubenswrapper[4677]: I1007 13:07:51.550398 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:51 crc kubenswrapper[4677]: I1007 13:07:51.550413 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:51Z","lastTransitionTime":"2025-10-07T13:07:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:07:51 crc kubenswrapper[4677]: I1007 13:07:51.653258 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:51 crc kubenswrapper[4677]: I1007 13:07:51.653318 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:51 crc kubenswrapper[4677]: I1007 13:07:51.653338 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:51 crc kubenswrapper[4677]: I1007 13:07:51.653369 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:51 crc kubenswrapper[4677]: I1007 13:07:51.653394 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:51Z","lastTransitionTime":"2025-10-07T13:07:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:07:51 crc kubenswrapper[4677]: I1007 13:07:51.756471 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:51 crc kubenswrapper[4677]: I1007 13:07:51.756531 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:51 crc kubenswrapper[4677]: I1007 13:07:51.756551 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:51 crc kubenswrapper[4677]: I1007 13:07:51.756574 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:51 crc kubenswrapper[4677]: I1007 13:07:51.756590 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:51Z","lastTransitionTime":"2025-10-07T13:07:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:07:51 crc kubenswrapper[4677]: I1007 13:07:51.859765 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:51 crc kubenswrapper[4677]: I1007 13:07:51.859836 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:51 crc kubenswrapper[4677]: I1007 13:07:51.859859 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:51 crc kubenswrapper[4677]: I1007 13:07:51.859889 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:51 crc kubenswrapper[4677]: I1007 13:07:51.859913 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:51Z","lastTransitionTime":"2025-10-07T13:07:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:07:51 crc kubenswrapper[4677]: I1007 13:07:51.962672 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:51 crc kubenswrapper[4677]: I1007 13:07:51.962750 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:51 crc kubenswrapper[4677]: I1007 13:07:51.962774 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:51 crc kubenswrapper[4677]: I1007 13:07:51.962797 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:51 crc kubenswrapper[4677]: I1007 13:07:51.962814 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:51Z","lastTransitionTime":"2025-10-07T13:07:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:07:52 crc kubenswrapper[4677]: I1007 13:07:52.066232 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:52 crc kubenswrapper[4677]: I1007 13:07:52.066272 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:52 crc kubenswrapper[4677]: I1007 13:07:52.066283 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:52 crc kubenswrapper[4677]: I1007 13:07:52.066299 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:52 crc kubenswrapper[4677]: I1007 13:07:52.066310 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:52Z","lastTransitionTime":"2025-10-07T13:07:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:07:52 crc kubenswrapper[4677]: I1007 13:07:52.169338 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:52 crc kubenswrapper[4677]: I1007 13:07:52.169395 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:52 crc kubenswrapper[4677]: I1007 13:07:52.169418 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:52 crc kubenswrapper[4677]: I1007 13:07:52.169481 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:52 crc kubenswrapper[4677]: I1007 13:07:52.169507 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:52Z","lastTransitionTime":"2025-10-07T13:07:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:07:52 crc kubenswrapper[4677]: I1007 13:07:52.271885 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:52 crc kubenswrapper[4677]: I1007 13:07:52.271932 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:52 crc kubenswrapper[4677]: I1007 13:07:52.271948 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:52 crc kubenswrapper[4677]: I1007 13:07:52.271970 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:52 crc kubenswrapper[4677]: I1007 13:07:52.271986 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:52Z","lastTransitionTime":"2025-10-07T13:07:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:07:52 crc kubenswrapper[4677]: I1007 13:07:52.302992 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 13:07:52 crc kubenswrapper[4677]: E1007 13:07:52.303221 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 13:07:52 crc kubenswrapper[4677]: I1007 13:07:52.375515 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:52 crc kubenswrapper[4677]: I1007 13:07:52.375811 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:52 crc kubenswrapper[4677]: I1007 13:07:52.376081 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:52 crc kubenswrapper[4677]: I1007 13:07:52.376337 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:52 crc kubenswrapper[4677]: I1007 13:07:52.376543 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:52Z","lastTransitionTime":"2025-10-07T13:07:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:07:52 crc kubenswrapper[4677]: I1007 13:07:52.480227 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:52 crc kubenswrapper[4677]: I1007 13:07:52.480307 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:52 crc kubenswrapper[4677]: I1007 13:07:52.480338 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:52 crc kubenswrapper[4677]: I1007 13:07:52.480367 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:52 crc kubenswrapper[4677]: I1007 13:07:52.480384 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:52Z","lastTransitionTime":"2025-10-07T13:07:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:07:52 crc kubenswrapper[4677]: I1007 13:07:52.583763 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:52 crc kubenswrapper[4677]: I1007 13:07:52.583815 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:52 crc kubenswrapper[4677]: I1007 13:07:52.583832 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:52 crc kubenswrapper[4677]: I1007 13:07:52.583855 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:52 crc kubenswrapper[4677]: I1007 13:07:52.583874 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:52Z","lastTransitionTime":"2025-10-07T13:07:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:07:52 crc kubenswrapper[4677]: I1007 13:07:52.686978 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:52 crc kubenswrapper[4677]: I1007 13:07:52.687042 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:52 crc kubenswrapper[4677]: I1007 13:07:52.687060 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:52 crc kubenswrapper[4677]: I1007 13:07:52.687088 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:52 crc kubenswrapper[4677]: I1007 13:07:52.687104 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:52Z","lastTransitionTime":"2025-10-07T13:07:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:07:52 crc kubenswrapper[4677]: I1007 13:07:52.790391 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:52 crc kubenswrapper[4677]: I1007 13:07:52.790568 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:52 crc kubenswrapper[4677]: I1007 13:07:52.790609 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:52 crc kubenswrapper[4677]: I1007 13:07:52.790635 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:52 crc kubenswrapper[4677]: I1007 13:07:52.790651 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:52Z","lastTransitionTime":"2025-10-07T13:07:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:07:52 crc kubenswrapper[4677]: I1007 13:07:52.892866 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:52 crc kubenswrapper[4677]: I1007 13:07:52.892909 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:52 crc kubenswrapper[4677]: I1007 13:07:52.892920 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:52 crc kubenswrapper[4677]: I1007 13:07:52.892936 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:52 crc kubenswrapper[4677]: I1007 13:07:52.892949 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:52Z","lastTransitionTime":"2025-10-07T13:07:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:07:52 crc kubenswrapper[4677]: I1007 13:07:52.996229 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:52 crc kubenswrapper[4677]: I1007 13:07:52.996300 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:52 crc kubenswrapper[4677]: I1007 13:07:52.996318 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:52 crc kubenswrapper[4677]: I1007 13:07:52.996350 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:52 crc kubenswrapper[4677]: I1007 13:07:52.996366 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:52Z","lastTransitionTime":"2025-10-07T13:07:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:07:53 crc kubenswrapper[4677]: I1007 13:07:53.099508 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:53 crc kubenswrapper[4677]: I1007 13:07:53.099569 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:53 crc kubenswrapper[4677]: I1007 13:07:53.099586 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:53 crc kubenswrapper[4677]: I1007 13:07:53.099610 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:53 crc kubenswrapper[4677]: I1007 13:07:53.099627 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:53Z","lastTransitionTime":"2025-10-07T13:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:07:53 crc kubenswrapper[4677]: I1007 13:07:53.202324 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:53 crc kubenswrapper[4677]: I1007 13:07:53.202385 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:53 crc kubenswrapper[4677]: I1007 13:07:53.202403 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:53 crc kubenswrapper[4677]: I1007 13:07:53.202479 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:53 crc kubenswrapper[4677]: I1007 13:07:53.202495 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:53Z","lastTransitionTime":"2025-10-07T13:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:07:53 crc kubenswrapper[4677]: I1007 13:07:53.302284 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8bljr" Oct 07 13:07:53 crc kubenswrapper[4677]: I1007 13:07:53.302383 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:07:53 crc kubenswrapper[4677]: E1007 13:07:53.302500 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8bljr" podUID="f63a77a6-7e4a-4ed0-a996-b8f80233d10c" Oct 07 13:07:53 crc kubenswrapper[4677]: E1007 13:07:53.302631 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 13:07:53 crc kubenswrapper[4677]: I1007 13:07:53.302821 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 13:07:53 crc kubenswrapper[4677]: E1007 13:07:53.303175 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 13:07:53 crc kubenswrapper[4677]: I1007 13:07:53.304577 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:53 crc kubenswrapper[4677]: I1007 13:07:53.304609 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:53 crc kubenswrapper[4677]: I1007 13:07:53.304631 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:53 crc kubenswrapper[4677]: I1007 13:07:53.304670 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:53 crc kubenswrapper[4677]: I1007 13:07:53.304687 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:53Z","lastTransitionTime":"2025-10-07T13:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:07:53 crc kubenswrapper[4677]: I1007 13:07:53.408126 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:53 crc kubenswrapper[4677]: I1007 13:07:53.408186 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:53 crc kubenswrapper[4677]: I1007 13:07:53.408206 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:53 crc kubenswrapper[4677]: I1007 13:07:53.408229 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:53 crc kubenswrapper[4677]: I1007 13:07:53.408245 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:53Z","lastTransitionTime":"2025-10-07T13:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:07:53 crc kubenswrapper[4677]: I1007 13:07:53.511374 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:53 crc kubenswrapper[4677]: I1007 13:07:53.511469 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:53 crc kubenswrapper[4677]: I1007 13:07:53.511499 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:53 crc kubenswrapper[4677]: I1007 13:07:53.511531 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:53 crc kubenswrapper[4677]: I1007 13:07:53.511553 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:53Z","lastTransitionTime":"2025-10-07T13:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:07:53 crc kubenswrapper[4677]: I1007 13:07:53.614733 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:53 crc kubenswrapper[4677]: I1007 13:07:53.615104 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:53 crc kubenswrapper[4677]: I1007 13:07:53.615307 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:53 crc kubenswrapper[4677]: I1007 13:07:53.615560 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:53 crc kubenswrapper[4677]: I1007 13:07:53.615761 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:53Z","lastTransitionTime":"2025-10-07T13:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:07:53 crc kubenswrapper[4677]: I1007 13:07:53.718478 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:53 crc kubenswrapper[4677]: I1007 13:07:53.718525 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:53 crc kubenswrapper[4677]: I1007 13:07:53.718539 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:53 crc kubenswrapper[4677]: I1007 13:07:53.718559 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:53 crc kubenswrapper[4677]: I1007 13:07:53.718575 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:53Z","lastTransitionTime":"2025-10-07T13:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:07:53 crc kubenswrapper[4677]: I1007 13:07:53.822093 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:53 crc kubenswrapper[4677]: I1007 13:07:53.822176 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:53 crc kubenswrapper[4677]: I1007 13:07:53.822187 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:53 crc kubenswrapper[4677]: I1007 13:07:53.822204 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:53 crc kubenswrapper[4677]: I1007 13:07:53.822217 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:53Z","lastTransitionTime":"2025-10-07T13:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:07:53 crc kubenswrapper[4677]: I1007 13:07:53.925927 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:53 crc kubenswrapper[4677]: I1007 13:07:53.926000 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:53 crc kubenswrapper[4677]: I1007 13:07:53.926018 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:53 crc kubenswrapper[4677]: I1007 13:07:53.926042 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:53 crc kubenswrapper[4677]: I1007 13:07:53.926059 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:53Z","lastTransitionTime":"2025-10-07T13:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:07:54 crc kubenswrapper[4677]: I1007 13:07:54.029313 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:54 crc kubenswrapper[4677]: I1007 13:07:54.029371 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:54 crc kubenswrapper[4677]: I1007 13:07:54.029389 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:54 crc kubenswrapper[4677]: I1007 13:07:54.029414 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:54 crc kubenswrapper[4677]: I1007 13:07:54.029458 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:54Z","lastTransitionTime":"2025-10-07T13:07:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:07:54 crc kubenswrapper[4677]: I1007 13:07:54.132074 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:54 crc kubenswrapper[4677]: I1007 13:07:54.132135 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:54 crc kubenswrapper[4677]: I1007 13:07:54.132153 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:54 crc kubenswrapper[4677]: I1007 13:07:54.132174 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:54 crc kubenswrapper[4677]: I1007 13:07:54.132192 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:54Z","lastTransitionTime":"2025-10-07T13:07:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:07:54 crc kubenswrapper[4677]: I1007 13:07:54.235217 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:54 crc kubenswrapper[4677]: I1007 13:07:54.235306 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:54 crc kubenswrapper[4677]: I1007 13:07:54.235324 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:54 crc kubenswrapper[4677]: I1007 13:07:54.235346 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:54 crc kubenswrapper[4677]: I1007 13:07:54.235364 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:54Z","lastTransitionTime":"2025-10-07T13:07:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:07:54 crc kubenswrapper[4677]: I1007 13:07:54.302618 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 13:07:54 crc kubenswrapper[4677]: E1007 13:07:54.302805 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 13:07:54 crc kubenswrapper[4677]: I1007 13:07:54.337812 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:54 crc kubenswrapper[4677]: I1007 13:07:54.337883 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:54 crc kubenswrapper[4677]: I1007 13:07:54.337906 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:54 crc kubenswrapper[4677]: I1007 13:07:54.337938 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:54 crc kubenswrapper[4677]: I1007 13:07:54.337958 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:54Z","lastTransitionTime":"2025-10-07T13:07:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:07:54 crc kubenswrapper[4677]: I1007 13:07:54.441368 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:54 crc kubenswrapper[4677]: I1007 13:07:54.441520 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:54 crc kubenswrapper[4677]: I1007 13:07:54.441541 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:54 crc kubenswrapper[4677]: I1007 13:07:54.441565 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:54 crc kubenswrapper[4677]: I1007 13:07:54.441581 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:54Z","lastTransitionTime":"2025-10-07T13:07:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:07:54 crc kubenswrapper[4677]: I1007 13:07:54.544808 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:54 crc kubenswrapper[4677]: I1007 13:07:54.545147 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:54 crc kubenswrapper[4677]: I1007 13:07:54.545199 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:54 crc kubenswrapper[4677]: I1007 13:07:54.545231 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:54 crc kubenswrapper[4677]: I1007 13:07:54.545255 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:54Z","lastTransitionTime":"2025-10-07T13:07:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:07:54 crc kubenswrapper[4677]: I1007 13:07:54.648531 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:54 crc kubenswrapper[4677]: I1007 13:07:54.648599 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:54 crc kubenswrapper[4677]: I1007 13:07:54.648616 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:54 crc kubenswrapper[4677]: I1007 13:07:54.648641 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:54 crc kubenswrapper[4677]: I1007 13:07:54.648662 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:54Z","lastTransitionTime":"2025-10-07T13:07:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:07:54 crc kubenswrapper[4677]: I1007 13:07:54.751645 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:54 crc kubenswrapper[4677]: I1007 13:07:54.751712 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:54 crc kubenswrapper[4677]: I1007 13:07:54.751728 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:54 crc kubenswrapper[4677]: I1007 13:07:54.751752 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:54 crc kubenswrapper[4677]: I1007 13:07:54.751769 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:54Z","lastTransitionTime":"2025-10-07T13:07:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:07:54 crc kubenswrapper[4677]: I1007 13:07:54.854373 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:54 crc kubenswrapper[4677]: I1007 13:07:54.854428 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:54 crc kubenswrapper[4677]: I1007 13:07:54.854477 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:54 crc kubenswrapper[4677]: I1007 13:07:54.854503 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:54 crc kubenswrapper[4677]: I1007 13:07:54.854520 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:54Z","lastTransitionTime":"2025-10-07T13:07:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:07:54 crc kubenswrapper[4677]: I1007 13:07:54.957548 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:54 crc kubenswrapper[4677]: I1007 13:07:54.957598 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:54 crc kubenswrapper[4677]: I1007 13:07:54.957610 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:54 crc kubenswrapper[4677]: I1007 13:07:54.957626 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:54 crc kubenswrapper[4677]: I1007 13:07:54.957637 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:54Z","lastTransitionTime":"2025-10-07T13:07:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:07:55 crc kubenswrapper[4677]: I1007 13:07:55.059852 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:55 crc kubenswrapper[4677]: I1007 13:07:55.059924 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:55 crc kubenswrapper[4677]: I1007 13:07:55.059944 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:55 crc kubenswrapper[4677]: I1007 13:07:55.059976 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:55 crc kubenswrapper[4677]: I1007 13:07:55.059998 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:55Z","lastTransitionTime":"2025-10-07T13:07:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:07:55 crc kubenswrapper[4677]: I1007 13:07:55.163098 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:55 crc kubenswrapper[4677]: I1007 13:07:55.163180 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:55 crc kubenswrapper[4677]: I1007 13:07:55.163204 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:55 crc kubenswrapper[4677]: I1007 13:07:55.163235 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:55 crc kubenswrapper[4677]: I1007 13:07:55.163258 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:55Z","lastTransitionTime":"2025-10-07T13:07:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:07:55 crc kubenswrapper[4677]: I1007 13:07:55.266132 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:55 crc kubenswrapper[4677]: I1007 13:07:55.266191 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:55 crc kubenswrapper[4677]: I1007 13:07:55.266208 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:55 crc kubenswrapper[4677]: I1007 13:07:55.266230 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:55 crc kubenswrapper[4677]: I1007 13:07:55.266247 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:55Z","lastTransitionTime":"2025-10-07T13:07:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:07:55 crc kubenswrapper[4677]: I1007 13:07:55.303105 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8bljr" Oct 07 13:07:55 crc kubenswrapper[4677]: I1007 13:07:55.303155 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 13:07:55 crc kubenswrapper[4677]: I1007 13:07:55.303237 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:07:55 crc kubenswrapper[4677]: E1007 13:07:55.303337 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8bljr" podUID="f63a77a6-7e4a-4ed0-a996-b8f80233d10c" Oct 07 13:07:55 crc kubenswrapper[4677]: E1007 13:07:55.303532 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 13:07:55 crc kubenswrapper[4677]: E1007 13:07:55.303757 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 13:07:55 crc kubenswrapper[4677]: I1007 13:07:55.370471 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:55 crc kubenswrapper[4677]: I1007 13:07:55.370534 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:55 crc kubenswrapper[4677]: I1007 13:07:55.370551 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:55 crc kubenswrapper[4677]: I1007 13:07:55.370576 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:55 crc kubenswrapper[4677]: I1007 13:07:55.370593 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:55Z","lastTransitionTime":"2025-10-07T13:07:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:07:55 crc kubenswrapper[4677]: I1007 13:07:55.411937 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:55 crc kubenswrapper[4677]: I1007 13:07:55.411997 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:55 crc kubenswrapper[4677]: I1007 13:07:55.412015 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:55 crc kubenswrapper[4677]: I1007 13:07:55.412041 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:55 crc kubenswrapper[4677]: I1007 13:07:55.412058 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:55Z","lastTransitionTime":"2025-10-07T13:07:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:07:55 crc kubenswrapper[4677]: E1007 13:07:55.438685 4677 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:07:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:07:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:07:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:07:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2461c0fe-8a8b-483d-90f2-2a3d8d7aca47\\\",\\\"systemUUID\\\":\\\"68c6c527-b248-4c1e-9fd2-b44685e78bcf\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:55Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:55 crc kubenswrapper[4677]: I1007 13:07:55.444635 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:55 crc kubenswrapper[4677]: I1007 13:07:55.444689 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 07 13:07:55 crc kubenswrapper[4677]: I1007 13:07:55.444708 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:55 crc kubenswrapper[4677]: I1007 13:07:55.444732 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:55 crc kubenswrapper[4677]: I1007 13:07:55.444750 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:55Z","lastTransitionTime":"2025-10-07T13:07:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:07:55 crc kubenswrapper[4677]: E1007 13:07:55.468037 4677 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:07:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:07:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:07:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:07:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2461c0fe-8a8b-483d-90f2-2a3d8d7aca47\\\",\\\"systemUUID\\\":\\\"68c6c527-b248-4c1e-9fd2-b44685e78bcf\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:55Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:55 crc kubenswrapper[4677]: I1007 13:07:55.473718 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:55 crc kubenswrapper[4677]: I1007 13:07:55.473798 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 07 13:07:55 crc kubenswrapper[4677]: I1007 13:07:55.473817 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:55 crc kubenswrapper[4677]: I1007 13:07:55.473842 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:55 crc kubenswrapper[4677]: I1007 13:07:55.473858 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:55Z","lastTransitionTime":"2025-10-07T13:07:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:07:55 crc kubenswrapper[4677]: E1007 13:07:55.494652 4677 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:07:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:07:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:07:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:07:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2461c0fe-8a8b-483d-90f2-2a3d8d7aca47\\\",\\\"systemUUID\\\":\\\"68c6c527-b248-4c1e-9fd2-b44685e78bcf\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:55Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:55 crc kubenswrapper[4677]: I1007 13:07:55.500126 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:55 crc kubenswrapper[4677]: I1007 13:07:55.500487 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 07 13:07:55 crc kubenswrapper[4677]: I1007 13:07:55.500693 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:55 crc kubenswrapper[4677]: I1007 13:07:55.500871 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:55 crc kubenswrapper[4677]: I1007 13:07:55.501030 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:55Z","lastTransitionTime":"2025-10-07T13:07:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:07:55 crc kubenswrapper[4677]: E1007 13:07:55.523080 4677 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:07:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:07:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:07:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:07:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2461c0fe-8a8b-483d-90f2-2a3d8d7aca47\\\",\\\"systemUUID\\\":\\\"68c6c527-b248-4c1e-9fd2-b44685e78bcf\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:55Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:55 crc kubenswrapper[4677]: I1007 13:07:55.530032 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:55 crc kubenswrapper[4677]: I1007 13:07:55.530088 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 07 13:07:55 crc kubenswrapper[4677]: I1007 13:07:55.530110 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:55 crc kubenswrapper[4677]: I1007 13:07:55.530137 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:55 crc kubenswrapper[4677]: I1007 13:07:55.530153 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:55Z","lastTransitionTime":"2025-10-07T13:07:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:07:55 crc kubenswrapper[4677]: E1007 13:07:55.547553 4677 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:07:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:07:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:07:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:07:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2461c0fe-8a8b-483d-90f2-2a3d8d7aca47\\\",\\\"systemUUID\\\":\\\"68c6c527-b248-4c1e-9fd2-b44685e78bcf\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:55Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:55 crc kubenswrapper[4677]: E1007 13:07:55.547792 4677 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 07 13:07:55 crc kubenswrapper[4677]: I1007 13:07:55.549925 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Oct 07 13:07:55 crc kubenswrapper[4677]: I1007 13:07:55.549973 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:55 crc kubenswrapper[4677]: I1007 13:07:55.549993 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:55 crc kubenswrapper[4677]: I1007 13:07:55.550017 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:55 crc kubenswrapper[4677]: I1007 13:07:55.550035 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:55Z","lastTransitionTime":"2025-10-07T13:07:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:07:55 crc kubenswrapper[4677]: I1007 13:07:55.653397 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:55 crc kubenswrapper[4677]: I1007 13:07:55.653484 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:55 crc kubenswrapper[4677]: I1007 13:07:55.653502 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:55 crc kubenswrapper[4677]: I1007 13:07:55.653531 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:55 crc kubenswrapper[4677]: I1007 13:07:55.653547 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:55Z","lastTransitionTime":"2025-10-07T13:07:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:07:55 crc kubenswrapper[4677]: I1007 13:07:55.756738 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:55 crc kubenswrapper[4677]: I1007 13:07:55.756796 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:55 crc kubenswrapper[4677]: I1007 13:07:55.756819 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:55 crc kubenswrapper[4677]: I1007 13:07:55.756848 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:55 crc kubenswrapper[4677]: I1007 13:07:55.756867 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:55Z","lastTransitionTime":"2025-10-07T13:07:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:07:55 crc kubenswrapper[4677]: I1007 13:07:55.860306 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:55 crc kubenswrapper[4677]: I1007 13:07:55.860381 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:55 crc kubenswrapper[4677]: I1007 13:07:55.860406 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:55 crc kubenswrapper[4677]: I1007 13:07:55.860473 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:55 crc kubenswrapper[4677]: I1007 13:07:55.860502 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:55Z","lastTransitionTime":"2025-10-07T13:07:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:07:55 crc kubenswrapper[4677]: I1007 13:07:55.964379 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:55 crc kubenswrapper[4677]: I1007 13:07:55.964473 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:55 crc kubenswrapper[4677]: I1007 13:07:55.964491 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:55 crc kubenswrapper[4677]: I1007 13:07:55.964516 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:55 crc kubenswrapper[4677]: I1007 13:07:55.964533 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:55Z","lastTransitionTime":"2025-10-07T13:07:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:07:56 crc kubenswrapper[4677]: I1007 13:07:56.067300 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:56 crc kubenswrapper[4677]: I1007 13:07:56.067370 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:56 crc kubenswrapper[4677]: I1007 13:07:56.067385 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:56 crc kubenswrapper[4677]: I1007 13:07:56.067405 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:56 crc kubenswrapper[4677]: I1007 13:07:56.067418 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:56Z","lastTransitionTime":"2025-10-07T13:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:07:56 crc kubenswrapper[4677]: I1007 13:07:56.170609 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:56 crc kubenswrapper[4677]: I1007 13:07:56.170684 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:56 crc kubenswrapper[4677]: I1007 13:07:56.170697 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:56 crc kubenswrapper[4677]: I1007 13:07:56.170724 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:56 crc kubenswrapper[4677]: I1007 13:07:56.170749 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:56Z","lastTransitionTime":"2025-10-07T13:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:07:56 crc kubenswrapper[4677]: I1007 13:07:56.273688 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:56 crc kubenswrapper[4677]: I1007 13:07:56.273756 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:56 crc kubenswrapper[4677]: I1007 13:07:56.273773 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:56 crc kubenswrapper[4677]: I1007 13:07:56.273794 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:56 crc kubenswrapper[4677]: I1007 13:07:56.273805 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:56Z","lastTransitionTime":"2025-10-07T13:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:07:56 crc kubenswrapper[4677]: I1007 13:07:56.302387 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 13:07:56 crc kubenswrapper[4677]: E1007 13:07:56.302534 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 13:07:56 crc kubenswrapper[4677]: I1007 13:07:56.376422 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:56 crc kubenswrapper[4677]: I1007 13:07:56.376504 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:56 crc kubenswrapper[4677]: I1007 13:07:56.376523 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:56 crc kubenswrapper[4677]: I1007 13:07:56.376547 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:56 crc kubenswrapper[4677]: I1007 13:07:56.376568 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:56Z","lastTransitionTime":"2025-10-07T13:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:07:56 crc kubenswrapper[4677]: I1007 13:07:56.479726 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:56 crc kubenswrapper[4677]: I1007 13:07:56.479794 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:56 crc kubenswrapper[4677]: I1007 13:07:56.479807 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:56 crc kubenswrapper[4677]: I1007 13:07:56.479827 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:56 crc kubenswrapper[4677]: I1007 13:07:56.479840 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:56Z","lastTransitionTime":"2025-10-07T13:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:07:56 crc kubenswrapper[4677]: I1007 13:07:56.583338 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:56 crc kubenswrapper[4677]: I1007 13:07:56.583405 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:56 crc kubenswrapper[4677]: I1007 13:07:56.583419 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:56 crc kubenswrapper[4677]: I1007 13:07:56.583469 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:56 crc kubenswrapper[4677]: I1007 13:07:56.583488 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:56Z","lastTransitionTime":"2025-10-07T13:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:07:56 crc kubenswrapper[4677]: I1007 13:07:56.686619 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:56 crc kubenswrapper[4677]: I1007 13:07:56.686692 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:56 crc kubenswrapper[4677]: I1007 13:07:56.686714 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:56 crc kubenswrapper[4677]: I1007 13:07:56.686738 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:56 crc kubenswrapper[4677]: I1007 13:07:56.686755 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:56Z","lastTransitionTime":"2025-10-07T13:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:07:56 crc kubenswrapper[4677]: I1007 13:07:56.789776 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:56 crc kubenswrapper[4677]: I1007 13:07:56.789843 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:56 crc kubenswrapper[4677]: I1007 13:07:56.789868 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:56 crc kubenswrapper[4677]: I1007 13:07:56.789898 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:56 crc kubenswrapper[4677]: I1007 13:07:56.789922 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:56Z","lastTransitionTime":"2025-10-07T13:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:07:56 crc kubenswrapper[4677]: I1007 13:07:56.893383 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:56 crc kubenswrapper[4677]: I1007 13:07:56.893481 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:56 crc kubenswrapper[4677]: I1007 13:07:56.893498 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:56 crc kubenswrapper[4677]: I1007 13:07:56.893522 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:56 crc kubenswrapper[4677]: I1007 13:07:56.893542 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:56Z","lastTransitionTime":"2025-10-07T13:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:07:56 crc kubenswrapper[4677]: I1007 13:07:56.996866 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:56 crc kubenswrapper[4677]: I1007 13:07:56.996963 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:56 crc kubenswrapper[4677]: I1007 13:07:56.996987 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:56 crc kubenswrapper[4677]: I1007 13:07:56.997019 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:56 crc kubenswrapper[4677]: I1007 13:07:56.997043 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:56Z","lastTransitionTime":"2025-10-07T13:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:07:57 crc kubenswrapper[4677]: I1007 13:07:57.100084 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:57 crc kubenswrapper[4677]: I1007 13:07:57.100133 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:57 crc kubenswrapper[4677]: I1007 13:07:57.100149 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:57 crc kubenswrapper[4677]: I1007 13:07:57.100172 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:57 crc kubenswrapper[4677]: I1007 13:07:57.100188 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:57Z","lastTransitionTime":"2025-10-07T13:07:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:07:57 crc kubenswrapper[4677]: I1007 13:07:57.203875 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:57 crc kubenswrapper[4677]: I1007 13:07:57.203940 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:57 crc kubenswrapper[4677]: I1007 13:07:57.203962 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:57 crc kubenswrapper[4677]: I1007 13:07:57.203992 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:57 crc kubenswrapper[4677]: I1007 13:07:57.204016 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:57Z","lastTransitionTime":"2025-10-07T13:07:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:07:57 crc kubenswrapper[4677]: I1007 13:07:57.302131 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 13:07:57 crc kubenswrapper[4677]: I1007 13:07:57.302229 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8bljr" Oct 07 13:07:57 crc kubenswrapper[4677]: I1007 13:07:57.302277 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:07:57 crc kubenswrapper[4677]: E1007 13:07:57.302513 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 13:07:57 crc kubenswrapper[4677]: E1007 13:07:57.302816 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8bljr" podUID="f63a77a6-7e4a-4ed0-a996-b8f80233d10c" Oct 07 13:07:57 crc kubenswrapper[4677]: E1007 13:07:57.302851 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 13:07:57 crc kubenswrapper[4677]: I1007 13:07:57.310675 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:57 crc kubenswrapper[4677]: I1007 13:07:57.310734 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:57 crc kubenswrapper[4677]: I1007 13:07:57.310751 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:57 crc kubenswrapper[4677]: I1007 13:07:57.310777 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:57 crc kubenswrapper[4677]: I1007 13:07:57.310796 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:57Z","lastTransitionTime":"2025-10-07T13:07:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:07:57 crc kubenswrapper[4677]: I1007 13:07:57.414609 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:57 crc kubenswrapper[4677]: I1007 13:07:57.414688 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:57 crc kubenswrapper[4677]: I1007 13:07:57.414713 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:57 crc kubenswrapper[4677]: I1007 13:07:57.414804 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:57 crc kubenswrapper[4677]: I1007 13:07:57.414828 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:57Z","lastTransitionTime":"2025-10-07T13:07:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:07:57 crc kubenswrapper[4677]: I1007 13:07:57.517866 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:57 crc kubenswrapper[4677]: I1007 13:07:57.517948 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:57 crc kubenswrapper[4677]: I1007 13:07:57.517970 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:57 crc kubenswrapper[4677]: I1007 13:07:57.518000 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:57 crc kubenswrapper[4677]: I1007 13:07:57.518025 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:57Z","lastTransitionTime":"2025-10-07T13:07:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:07:57 crc kubenswrapper[4677]: I1007 13:07:57.621062 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:57 crc kubenswrapper[4677]: I1007 13:07:57.621142 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:57 crc kubenswrapper[4677]: I1007 13:07:57.621158 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:57 crc kubenswrapper[4677]: I1007 13:07:57.621177 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:57 crc kubenswrapper[4677]: I1007 13:07:57.621191 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:57Z","lastTransitionTime":"2025-10-07T13:07:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:07:57 crc kubenswrapper[4677]: I1007 13:07:57.724808 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:57 crc kubenswrapper[4677]: I1007 13:07:57.724882 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:57 crc kubenswrapper[4677]: I1007 13:07:57.724905 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:57 crc kubenswrapper[4677]: I1007 13:07:57.724938 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:57 crc kubenswrapper[4677]: I1007 13:07:57.724964 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:57Z","lastTransitionTime":"2025-10-07T13:07:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:07:57 crc kubenswrapper[4677]: I1007 13:07:57.829120 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:57 crc kubenswrapper[4677]: I1007 13:07:57.829227 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:57 crc kubenswrapper[4677]: I1007 13:07:57.829247 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:57 crc kubenswrapper[4677]: I1007 13:07:57.829278 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:57 crc kubenswrapper[4677]: I1007 13:07:57.829295 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:57Z","lastTransitionTime":"2025-10-07T13:07:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:07:57 crc kubenswrapper[4677]: I1007 13:07:57.879201 4677 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 07 13:07:57 crc kubenswrapper[4677]: I1007 13:07:57.901264 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf9347ca53bc58ad2e19bdbccd5eb40fde5ef36cdc0c2a2899e7e86977208446\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7932ee6d24ab75f34dabb17b5c2732dc1437e94b4fab6cace5c5bf4d8b4a8fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:57Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:57 crc kubenswrapper[4677]: I1007 13:07:57.916039 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c2h2k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6a7b491-6ed9-4906-8d2d-d8913a581b95\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://079f44a0676fd6e659268707658dfce76f5c80881ebd1b7f77b831a653002cbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gh4gv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c2h2k\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:57Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:57 crc kubenswrapper[4677]: I1007 13:07:57.931942 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:57 crc kubenswrapper[4677]: I1007 13:07:57.932124 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:57 crc kubenswrapper[4677]: I1007 13:07:57.932145 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:57 crc kubenswrapper[4677]: I1007 13:07:57.932171 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:57 crc kubenswrapper[4677]: I1007 13:07:57.932203 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:57Z","lastTransitionTime":"2025-10-07T13:07:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:07:57 crc kubenswrapper[4677]: I1007 13:07:57.936346 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-czmsr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67f7f734-b59a-447c-b4a5-5aeb78d3a4dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://620c115cda692d86e1c655fe633ade8d56b4ad3faff70ec3383e0d6931e91acd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerSta
tuses\\\":[{\\\"containerID\\\":\\\"cri-o://84e2aad54ae491e74845fbb681491bde96694b5f242d15a1b4c0f62b81048e7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84e2aad54ae491e74845fbb681491bde96694b5f242d15a1b4c0f62b81048e7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2dfd4b5d1804d9d8e9ccf4dea8fb23cf60cd039e32d4bfc7d0b0e189da178b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2dfd4b5d1804d9d8e9ccf4dea8fb23cf60cd039e32d4bfc7d0b0e189da178b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c337ed2525a5d0aca41ff193138829ecd98172765110f97081d5a9b57ea39519\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c337ed2525a5d0aca41ff193138829ecd98172765110f97081d5a9b57ea39519\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:
07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b53d862bdcbdf9ebb58ca6717073ecf18236be981bf633bf22e3a077fa6ed90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b53d862bdcbdf9ebb58ca6717073ecf18236be981bf633bf22e3a077fa6ed90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3de85e8780fadb509355343e297ea73c2bd1047219be9911d445e044d40d378b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3de85e8780fadb509355343e297ea73c2bd1047219be9911d445e044d40d378b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6309cf358d3d85018e9e89fdb974146ab00944de09d3b38cfbf357df59b442f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-c
ni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6309cf358d3d85018e9e89fdb974146ab00944de09d3b38cfbf357df59b442f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-czmsr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:57Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:57 crc kubenswrapper[4677]: I1007 13:07:57.948106 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8xd94" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f78c6e9a-e5e3-4296-b2e6-3ba36d1808ff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2e7ebbc9f01ac7f853075c65c8cc57c691cf3f95e41036294486ad4a3bb807c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\
"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8xd94\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:57Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:57 crc kubenswrapper[4677]: I1007 13:07:57.971334 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9c35782-52f8-4fbc-9e52-07ee92002e3d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9182677b05c8d32f333b4e806b6dc29e0ce3f6171616ed303459ccb6a3754a4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://550a4491cebcbd8b3a62831cce07b13bb79051cd51505aab1f74bcfee692f7b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d90d7cc786a9f94c269a99be97c00685a2e10bde12e0afe4db2de40b95749a47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42
928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e99541a9f53339e760fb1074be18ebfcb8b225c64c290478559d2e3722ba9296\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73cec6b690e4d01d9d206a812f278832d622a7bdfb74ddcfb5904e19f721fae6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f16639187300b01b7c9f30dda26da7c1de9de9c66baf7ad716875eba41a5d7d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f16639187300b01b7c9f30dda26da7c1de9de9c66baf7ad716875eba41a5d7d6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dab4e59081dcacddedf345841dce8c940fae52aeee55d6c956e1364dd70872b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dab4e59081dcacddedf345841dce8c940fae52aeee55d6c956e1364dd70872b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://abcfe5aa83c73ae24b7f0b414b47344969924ca50a5e1669bfb4704f1386cd17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abcfe5aa83c73ae24b7f0b414b47344969924ca50a5e1669bfb4704f1386cd17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:57Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:57 crc kubenswrapper[4677]: I1007 13:07:57.988595 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ad65790-6a90-4c21-b5c5-ac1ddf2cbe52\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34b15b8f71a10920a74f784c3440031e14726f661e10f628b269da08e70a7cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a51c6d4e82136b444754dc679f864558f74624af4ff94f794e473c92c8f6c87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96d7ddc445a61d5fd4959d1b3b4e2c93503111a12f461d945dd298a3f8540f65\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13a4e9dcaf4f4585c45625444ba093f84acc83f03e96235efd054bed4a38fc21\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f3139524d94a487c756b4a7e3660c0a4380bcba8aa8b588368530f43aab0b1d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T13:07:25Z\\\",\\\"message\\\":\\\"le observer\\\\nW1007 13:07:25.065953 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1007 13:07:25.066115 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 13:07:25.067558 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2107846005/tls.crt::/tmp/serving-cert-2107846005/tls.key\\\\\\\"\\\\nI1007 13:07:25.378394 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1007 13:07:25.383525 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1007 13:07:25.383565 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1007 13:07:25.383605 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1007 13:07:25.383617 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1007 13:07:25.390977 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1007 13:07:25.391014 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1007 13:07:25.391020 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 13:07:25.391038 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 13:07:25.391053 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1007 13:07:25.391061 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1007 13:07:25.391071 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1007 13:07:25.391088 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1007 13:07:25.394664 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b92962cb41f37a615f473651c01e37f5d53e01f3fb4b7c0eb2092095bb55239\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d927fbc5242eb951e8c2ec6d2859315f34e4b190bb43e646044111d1f583bf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d927fbc5242eb951e8c2ec6d2859315f34e4b190bb43e646044111d1f583bf0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:57Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:58 crc kubenswrapper[4677]: I1007 13:07:58.000964 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ef29067dc23263a94c4f861ada9ebbe04aae442de3da9fa34db521177f60ce8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:57Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:58 crc kubenswrapper[4677]: I1007 13:07:58.011707 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:58Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:58 crc kubenswrapper[4677]: I1007 13:07:58.023341 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:58Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:58 crc kubenswrapper[4677]: I1007 13:07:58.034265 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e49f4b2f98de9e297e6a31a5583120192adf9a013700b49bb419e54d9e75fdbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:58Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:58 crc kubenswrapper[4677]: I1007 13:07:58.034672 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:58 crc kubenswrapper[4677]: I1007 13:07:58.034713 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 07 13:07:58 crc kubenswrapper[4677]: I1007 13:07:58.034726 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:58 crc kubenswrapper[4677]: I1007 13:07:58.034746 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:58 crc kubenswrapper[4677]: I1007 13:07:58.034759 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:58Z","lastTransitionTime":"2025-10-07T13:07:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:07:58 crc kubenswrapper[4677]: I1007 13:07:58.044921 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qf2v9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea5a5436-29f6-4edd-9d4d-22eb9dd828c3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1a8e8a31adbc84ed02ff984941bb00da95740b19e8717fc6d4fb39b62338973\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ftmxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://97188126bcea5ad3844f74c9402e831926e1142944778240b4d4b26da7ea40c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"
running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ftmxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qf2v9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:58Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:58 crc kubenswrapper[4677]: I1007 13:07:58.055988 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8bljr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f63a77a6-7e4a-4ed0-a996-b8f80233d10c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rc97j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rc97j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:43Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8bljr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:58Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:58 crc kubenswrapper[4677]: I1007 13:07:58.076885 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c40c47d2-50a4-43f1-9b6e-08b60a3260c5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f36e52a7e88b59d8fd38c1fe659ce9b539e514c9d31e326a3ed647ebb8d19781\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://84b1c015461fecca9e5122abe950f33e24f4b7188568ea84cb059a08a4637963\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ecf81a2a9f147c0d9643f8e6c45248164053203ca4e5bbdc57c38e5803a5386\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e8dc3f8bdc52104efdb49a017d6497e2aaa3ed2b593794413fcd1acf2e06d36\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:58Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:58 crc kubenswrapper[4677]: I1007 13:07:58.092406 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:58Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:58 crc kubenswrapper[4677]: I1007 13:07:58.105785 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-r7cnz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7879fa59-a7cb-4d29-ba3a-c91f43bfcba6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5b2cfaaf4533573a7cdf927cb9a0b61690f4f04ca22f5da5013fd218ee2cba1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r59hd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d3e4ef8267212ad1faf24bfcb3b6f633a283684ba587e304e94d434bd9a2618\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r59hd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-r7cnz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:58Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:58 crc kubenswrapper[4677]: I1007 13:07:58.127753 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29c8j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3458826a-000d-407d-92c8-236d1a05842e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cf7d8cdd34bc883eae38c5e4690efd4e1c29cc633b5bbadc5de2b5b844a9da3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b77b2aafb3baf1c5b72d62156bd1c1bec76385637d5795166fe3d4f22a169503\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99b5fbb5ad3249aa5264c37bd635ed5f6283ec72c7eb071002cd7bddc12052f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eee7c253a1a514447553be977a3e534608ef6a1178664bf139ee84ec41180db0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f333db7aeb7d3cd308131992b4cd1284c1c56e27bbfd731404febc0efc953925\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ddf4e352b778815786f6fb204486a53d958310e53569f89a2895fe388a727da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fba8b3547cb6e55cb13bf4889eabf3bf5e731d4
4475c488c659fe844b985074\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fba8b3547cb6e55cb13bf4889eabf3bf5e731d44475c488c659fe844b985074\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T13:07:41Z\\\",\\\"message\\\":\\\"mage-registry]} name:Service_openshift-image-registry/image-registry_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.93:5000:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {83c1e277-3d22-42ae-a355-f7a0ff0bd171}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1007 13:07:41.452562 6109 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-ingress-canary/ingress-canary]} name:Service_openshift-ingress-canary/ingress-canary_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.34:8443: 10.217.5.34:8888:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7715118b-bb1b-400a-803e-7ab2cc3eeec0}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1007 13:07:41.452766 6109 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-console/console\\\\\\\"}\\\\nF1007 13:07:41.452679 6109 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:40Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-29c8j_openshift-ovn-kubernetes(3458826a-000d-407d-92c8-236d1a05842e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1f410624ff7e026c196d43d5ef830ce7b34981b703d5399a135dab0122640ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa229d0805576aa04db8e58e52e8c8bdc07663414dde42a94cac4fe628cfac7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa229d0805576aa04db8e58e52e8c8bdc07663414dde42a94cac4fe628cfac7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-29c8j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:58Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:58 crc kubenswrapper[4677]: I1007 13:07:58.137043 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:58 crc kubenswrapper[4677]: I1007 13:07:58.137136 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:58 crc kubenswrapper[4677]: I1007 13:07:58.137151 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:58 crc kubenswrapper[4677]: I1007 13:07:58.137209 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:58 crc kubenswrapper[4677]: I1007 13:07:58.137221 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:58Z","lastTransitionTime":"2025-10-07T13:07:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:07:58 crc kubenswrapper[4677]: I1007 13:07:58.144461 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pjgpx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"73bebfb3-50b5-48b6-b348-1d1feb6202d2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b0ac92c71edc3d5107aece2d0e005a546cf25d79d696f4e330b7c0c8babc546\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h59cg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pjgpx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:58Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:58 crc kubenswrapper[4677]: I1007 13:07:58.239345 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:58 crc kubenswrapper[4677]: I1007 13:07:58.239404 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:58 crc kubenswrapper[4677]: I1007 13:07:58.239421 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:58 crc kubenswrapper[4677]: I1007 13:07:58.239539 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:58 crc kubenswrapper[4677]: I1007 13:07:58.239557 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:58Z","lastTransitionTime":"2025-10-07T13:07:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:07:58 crc kubenswrapper[4677]: I1007 13:07:58.302526 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 13:07:58 crc kubenswrapper[4677]: E1007 13:07:58.302650 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 13:07:58 crc kubenswrapper[4677]: I1007 13:07:58.342887 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:58 crc kubenswrapper[4677]: I1007 13:07:58.342944 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:58 crc kubenswrapper[4677]: I1007 13:07:58.342960 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:58 crc kubenswrapper[4677]: I1007 13:07:58.342984 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:58 crc kubenswrapper[4677]: I1007 13:07:58.343000 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:58Z","lastTransitionTime":"2025-10-07T13:07:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:07:58 crc kubenswrapper[4677]: I1007 13:07:58.445864 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:58 crc kubenswrapper[4677]: I1007 13:07:58.445928 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:58 crc kubenswrapper[4677]: I1007 13:07:58.445957 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:58 crc kubenswrapper[4677]: I1007 13:07:58.445982 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:58 crc kubenswrapper[4677]: I1007 13:07:58.445999 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:58Z","lastTransitionTime":"2025-10-07T13:07:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:07:58 crc kubenswrapper[4677]: I1007 13:07:58.548940 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:58 crc kubenswrapper[4677]: I1007 13:07:58.549024 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:58 crc kubenswrapper[4677]: I1007 13:07:58.549045 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:58 crc kubenswrapper[4677]: I1007 13:07:58.549078 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:58 crc kubenswrapper[4677]: I1007 13:07:58.549101 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:58Z","lastTransitionTime":"2025-10-07T13:07:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:07:58 crc kubenswrapper[4677]: I1007 13:07:58.651531 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:58 crc kubenswrapper[4677]: I1007 13:07:58.651576 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:58 crc kubenswrapper[4677]: I1007 13:07:58.651587 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:58 crc kubenswrapper[4677]: I1007 13:07:58.651604 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:58 crc kubenswrapper[4677]: I1007 13:07:58.651616 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:58Z","lastTransitionTime":"2025-10-07T13:07:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:07:58 crc kubenswrapper[4677]: I1007 13:07:58.755180 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:58 crc kubenswrapper[4677]: I1007 13:07:58.755231 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:58 crc kubenswrapper[4677]: I1007 13:07:58.755247 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:58 crc kubenswrapper[4677]: I1007 13:07:58.755270 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:58 crc kubenswrapper[4677]: I1007 13:07:58.755288 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:58Z","lastTransitionTime":"2025-10-07T13:07:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:07:58 crc kubenswrapper[4677]: I1007 13:07:58.858820 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:58 crc kubenswrapper[4677]: I1007 13:07:58.858922 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:58 crc kubenswrapper[4677]: I1007 13:07:58.858956 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:58 crc kubenswrapper[4677]: I1007 13:07:58.858987 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:58 crc kubenswrapper[4677]: I1007 13:07:58.859008 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:58Z","lastTransitionTime":"2025-10-07T13:07:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:07:58 crc kubenswrapper[4677]: I1007 13:07:58.962719 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:58 crc kubenswrapper[4677]: I1007 13:07:58.962786 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:58 crc kubenswrapper[4677]: I1007 13:07:58.962804 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:58 crc kubenswrapper[4677]: I1007 13:07:58.962828 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:58 crc kubenswrapper[4677]: I1007 13:07:58.962847 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:58Z","lastTransitionTime":"2025-10-07T13:07:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:07:59 crc kubenswrapper[4677]: I1007 13:07:59.021818 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f63a77a6-7e4a-4ed0-a996-b8f80233d10c-metrics-certs\") pod \"network-metrics-daemon-8bljr\" (UID: \"f63a77a6-7e4a-4ed0-a996-b8f80233d10c\") " pod="openshift-multus/network-metrics-daemon-8bljr" Oct 07 13:07:59 crc kubenswrapper[4677]: E1007 13:07:59.022048 4677 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 07 13:07:59 crc kubenswrapper[4677]: E1007 13:07:59.022165 4677 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f63a77a6-7e4a-4ed0-a996-b8f80233d10c-metrics-certs podName:f63a77a6-7e4a-4ed0-a996-b8f80233d10c nodeName:}" failed. No retries permitted until 2025-10-07 13:08:15.022136354 +0000 UTC m=+66.507845509 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f63a77a6-7e4a-4ed0-a996-b8f80233d10c-metrics-certs") pod "network-metrics-daemon-8bljr" (UID: "f63a77a6-7e4a-4ed0-a996-b8f80233d10c") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 07 13:07:59 crc kubenswrapper[4677]: I1007 13:07:59.065955 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:59 crc kubenswrapper[4677]: I1007 13:07:59.066033 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:59 crc kubenswrapper[4677]: I1007 13:07:59.066057 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:59 crc kubenswrapper[4677]: I1007 13:07:59.066081 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:59 crc kubenswrapper[4677]: I1007 13:07:59.066099 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:59Z","lastTransitionTime":"2025-10-07T13:07:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:07:59 crc kubenswrapper[4677]: I1007 13:07:59.168797 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:59 crc kubenswrapper[4677]: I1007 13:07:59.168908 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:59 crc kubenswrapper[4677]: I1007 13:07:59.168929 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:59 crc kubenswrapper[4677]: I1007 13:07:59.169141 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:59 crc kubenswrapper[4677]: I1007 13:07:59.169163 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:59Z","lastTransitionTime":"2025-10-07T13:07:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:07:59 crc kubenswrapper[4677]: I1007 13:07:59.273094 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:59 crc kubenswrapper[4677]: I1007 13:07:59.273152 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:59 crc kubenswrapper[4677]: I1007 13:07:59.273168 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:59 crc kubenswrapper[4677]: I1007 13:07:59.273191 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:59 crc kubenswrapper[4677]: I1007 13:07:59.273207 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:59Z","lastTransitionTime":"2025-10-07T13:07:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:07:59 crc kubenswrapper[4677]: I1007 13:07:59.303287 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8bljr" Oct 07 13:07:59 crc kubenswrapper[4677]: I1007 13:07:59.303471 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 13:07:59 crc kubenswrapper[4677]: E1007 13:07:59.303675 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8bljr" podUID="f63a77a6-7e4a-4ed0-a996-b8f80233d10c" Oct 07 13:07:59 crc kubenswrapper[4677]: I1007 13:07:59.304014 4677 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:07:59 crc kubenswrapper[4677]: E1007 13:07:59.304158 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 13:07:59 crc kubenswrapper[4677]: E1007 13:07:59.304307 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 13:07:59 crc kubenswrapper[4677]: I1007 13:07:59.317641 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c40c47d2-50a4-43f1-9b6e-08b60a3260c5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f36e52a7e88b59d8fd38c1fe659ce9b539e514c9d31e326a3ed647ebb8d19781\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://84b1c015461fecca9e5122abe950f33e24f4b7188568ea84cb059a08a4637963\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13
:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ecf81a2a9f147c0d9643f8e6c45248164053203ca4e5bbdc57c38e5803a5386\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e8dc3f8bdc52104efdb49a017d6497e2aaa3ed2b593794413fcd1acf2e06d36\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:59Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:59 crc kubenswrapper[4677]: I1007 13:07:59.336830 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:59Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:59 crc kubenswrapper[4677]: I1007 13:07:59.355832 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-r7cnz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7879fa59-a7cb-4d29-ba3a-c91f43bfcba6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5b2cfaaf4533573a7cdf927cb9a0b61690f4f04ca22f5da5013fd218ee2cba1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r59hd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d3e4ef8267212ad1faf24bfcb3b6f633a283684ba587e304e94d434bd9a2618\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r59hd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-r7cnz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:59Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:59 crc kubenswrapper[4677]: I1007 13:07:59.376037 4677 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:59 crc kubenswrapper[4677]: I1007 13:07:59.376072 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:59 crc kubenswrapper[4677]: I1007 13:07:59.376083 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:59 crc kubenswrapper[4677]: I1007 13:07:59.376103 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:59 crc kubenswrapper[4677]: I1007 13:07:59.376213 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:59Z","lastTransitionTime":"2025-10-07T13:07:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:07:59 crc kubenswrapper[4677]: I1007 13:07:59.383181 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29c8j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3458826a-000d-407d-92c8-236d1a05842e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cf7d8cdd34bc883eae38c5e4690efd4e1c29cc633b5bbadc5de2b5b844a9da3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b77b2aafb3baf1c5b72d62156bd1c1bec76385637d5795166fe3d4f22a169503\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99b5fbb5ad3249aa5264c37bd635ed5f6283ec72c7eb071002cd7bddc12052f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eee7c253a1a514447553be977a3e534608ef6a1178664bf139ee84ec41180db0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f333db7aeb7d3cd308131992b4cd1284c1c56e27bbfd731404febc0efc953925\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ddf4e352b778815786f6fb204486a53d958310e53569f89a2895fe388a727da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fba8b3547cb6e55cb13bf4889eabf3bf5e731d4
4475c488c659fe844b985074\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fba8b3547cb6e55cb13bf4889eabf3bf5e731d44475c488c659fe844b985074\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T13:07:41Z\\\",\\\"message\\\":\\\"mage-registry]} name:Service_openshift-image-registry/image-registry_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.93:5000:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {83c1e277-3d22-42ae-a355-f7a0ff0bd171}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1007 13:07:41.452562 6109 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-ingress-canary/ingress-canary]} name:Service_openshift-ingress-canary/ingress-canary_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.34:8443: 10.217.5.34:8888:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7715118b-bb1b-400a-803e-7ab2cc3eeec0}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1007 13:07:41.452766 6109 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-console/console\\\\\\\"}\\\\nF1007 13:07:41.452679 6109 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:40Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-29c8j_openshift-ovn-kubernetes(3458826a-000d-407d-92c8-236d1a05842e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1f410624ff7e026c196d43d5ef830ce7b34981b703d5399a135dab0122640ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa229d0805576aa04db8e58e52e8c8bdc07663414dde42a94cac4fe628cfac7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa229d0805576aa04db8e58e52e8c8bdc07663414dde42a94cac4fe628cfac7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-29c8j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:59Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:59 crc kubenswrapper[4677]: I1007 13:07:59.399364 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pjgpx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"73bebfb3-50b5-48b6-b348-1d1feb6202d2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b0ac92c71edc3d5107aece2d0e005a546cf25d79d696f4e330b7c0c8babc546\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-
cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h59cg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pjgpx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:59Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:59 crc kubenswrapper[4677]: I1007 13:07:59.411670 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qf2v9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea5a5436-29f6-4edd-9d4d-22eb9dd828c3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1a8e8a31adbc84ed02ff984941bb00da95740b19e8717fc6d4fb39b62338973\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ftmxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://97188126bcea5ad3844f74c9402e831926e1142944778240b4d4b26da7ea40c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ftmxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qf2v9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:59Z is after 2025-08-24T17:21:41Z" Oct 07 
13:07:59 crc kubenswrapper[4677]: I1007 13:07:59.423419 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8bljr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f63a77a6-7e4a-4ed0-a996-b8f80233d10c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rc97j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rc97j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:43Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8bljr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:59Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:59 crc kubenswrapper[4677]: I1007 13:07:59.441707 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf9347ca53bc58ad2e19bdbccd5eb40fde5ef36cdc0c2a2899e7e86977208446\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7932ee6d24ab75f34dabb17b5c2732dc1437e94b4fab6cace5c5bf4d8b4a8fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:59Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:59 crc kubenswrapper[4677]: I1007 13:07:59.452921 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c2h2k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6a7b491-6ed9-4906-8d2d-d8913a581b95\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://079f44a0676fd6e659268707658dfce76f5c80881ebd1b7f77b831a653002cbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gh4gv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c2h2k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:59Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:59 crc kubenswrapper[4677]: I1007 13:07:59.466554 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-czmsr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"67f7f734-b59a-447c-b4a5-5aeb78d3a4dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://620c115cda692d86e1c655fe633ade8d56b4ad3faff70ec3383e0d6931e91acd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84e2aad54ae491e74845fbb681491bde96694b5f242d15a1b4c0f62b81048e7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84e2aad54ae491e74845fbb681491bde96694b5f242d15a1b4c0f62b81048e7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2dfd4b5d1804d9d8e9ccf4dea8fb23cf60cd039e32d4bfc7d0b0e189da178b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2dfd4b5d1804d9d8e9ccf4dea8fb23cf60cd039e32d4bfc7d0b0e189da178b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c337ed2525a5d0aca41ff193138829ecd98172765110f97081d5a9b57ea39519\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c337ed2525a5d0aca41ff193138829ecd98172765110f97081d5a9b57ea39519\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b53d862bdcbdf9ebb58ca6717073ecf18236be981bf633bf22e3a077fa6ed90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b53d862bdcbdf9ebb58ca6717073ecf18236be981bf633bf22e3a077fa6ed90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3de85e8780fadb509355343e297ea73c2bd1047219be9911d445e044d40d378b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3de85e8780fadb509355343e297ea73c2bd1047219be9911d445e044d40d378b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6309cf358d3d85018e9e89fdb974146ab00944de09d3b38cfbf357df59b442f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6309cf358d3d85018e9e89fdb974146ab00944de09d3b38cfbf357df59b442f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-czmsr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:59Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:59 crc kubenswrapper[4677]: I1007 13:07:59.479227 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:59 crc kubenswrapper[4677]: I1007 13:07:59.479274 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:59 crc 
kubenswrapper[4677]: I1007 13:07:59.479286 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:59 crc kubenswrapper[4677]: I1007 13:07:59.479306 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:59 crc kubenswrapper[4677]: I1007 13:07:59.479319 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:59Z","lastTransitionTime":"2025-10-07T13:07:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:07:59 crc kubenswrapper[4677]: I1007 13:07:59.485250 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9c35782-52f8-4fbc-9e52-07ee92002e3d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9182677b05c8d32f333b4e806b6dc29e0ce3f6171616ed303459ccb6a3754a4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://550a4491cebcbd8b3a62831cce07b13bb79051cd51505aab1f74bcfee692f7b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mou
ntPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d90d7cc786a9f94c269a99be97c00685a2e10bde12e0afe4db2de40b95749a47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e99541a9f53339e760fb1074be18ebfcb8b225c64c290478559d2e3722ba9296\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73cec6b690e4d01d9d206a812f278832d622a7bdfb74ddcfb5904e19f721fae6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f16639187300b01b7c9f30dda26da7c1de9de9c66baf7ad716875eba41a5d7d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f16639187300b01b7c9f30dda26da7c1de9de9c66baf7ad716875eba41a5d7d6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:10Z\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dab4e59081dcacddedf345841dce8c940fae52aeee55d6c956e1364dd70872b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dab4e59081dcacddedf345841dce8c940fae52aeee55d6c956e1364dd70872b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://abcfe5aa83c73ae24b7f0b414b47344969924ca50a5e1669bfb4704f1386cd17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abcfe5aa83c73ae24b7f0b414b47344969924ca50a5e1669bfb4704f1386cd17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:59Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:59 crc kubenswrapper[4677]: I1007 13:07:59.501320 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ad65790-6a90-4c21-b5c5-ac1ddf2cbe52\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34b15b8f71a10920a74f784c3440031e14726f661e10f628b269da08e70a7cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a51c6d4e82136b444754dc679f864558f74624af4ff94f794e473c92c8f6c87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96d7ddc445a61d5fd4959d1b3b4e2c93503111a12f461d945dd298a3f8540f65\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13a4e9dcaf4f4585c45625444ba093f84acc83f03e96235efd054bed4a38fc21\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f3139524d94a487c756b4a7e3660c0a4380bcba8aa8b588368530f43aab0b1d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T13:07:25Z\\\",\\\"message\\\":\\\"le observer\\\\nW1007 13:07:25.065953 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1007 13:07:25.066115 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 13:07:25.067558 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2107846005/tls.crt::/tmp/serving-cert-2107846005/tls.key\\\\\\\"\\\\nI1007 13:07:25.378394 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1007 13:07:25.383525 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1007 13:07:25.383565 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1007 13:07:25.383605 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1007 13:07:25.383617 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1007 13:07:25.390977 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1007 13:07:25.391014 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1007 13:07:25.391020 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 13:07:25.391038 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 13:07:25.391053 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1007 13:07:25.391061 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1007 13:07:25.391071 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1007 13:07:25.391088 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1007 13:07:25.394664 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b92962cb41f37a615f473651c01e37f5d53e01f3fb4b7c0eb2092095bb55239\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d927fbc5242eb951e8c2ec6d2859315f34e4b190bb43e646044111d1f583bf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d927fbc5242eb951e8c2ec6d2859315f34e4b190bb43e646044111d1f583bf0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:59Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:59 crc kubenswrapper[4677]: I1007 13:07:59.516478 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ef29067dc23263a94c4f861ada9ebbe04aae442de3da9fa34db521177f60ce8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:59Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:59 crc kubenswrapper[4677]: I1007 13:07:59.528321 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:59Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:59 crc kubenswrapper[4677]: I1007 13:07:59.547836 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8xd94" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f78c6e9a-e5e3-4296-b2e6-3ba36d1808ff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2e7ebbc9f01ac7f853075c65c8cc57c691cf3f95e41036294486ad4a3bb807c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8xd94\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:59Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:59 crc kubenswrapper[4677]: I1007 13:07:59.563900 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:59Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:59 crc kubenswrapper[4677]: I1007 13:07:59.582110 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:59 crc kubenswrapper[4677]: I1007 13:07:59.582172 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:59 crc kubenswrapper[4677]: I1007 13:07:59.582213 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:59 crc kubenswrapper[4677]: I1007 13:07:59.582238 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:59 crc kubenswrapper[4677]: I1007 13:07:59.582251 4677 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:59Z","lastTransitionTime":"2025-10-07T13:07:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:07:59 crc kubenswrapper[4677]: I1007 13:07:59.583499 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e49f4b2f98de9e297e6a31a5583120192adf9a013700b49bb419e54d9e75fdbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:07:59Z is after 2025-08-24T17:21:41Z" Oct 07 13:07:59 crc kubenswrapper[4677]: I1007 13:07:59.686205 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:59 crc kubenswrapper[4677]: I1007 13:07:59.686281 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:59 crc kubenswrapper[4677]: I1007 13:07:59.686294 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:59 crc kubenswrapper[4677]: I1007 13:07:59.686311 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:59 crc kubenswrapper[4677]: I1007 13:07:59.686346 4677 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:59Z","lastTransitionTime":"2025-10-07T13:07:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:07:59 crc kubenswrapper[4677]: I1007 13:07:59.789717 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:59 crc kubenswrapper[4677]: I1007 13:07:59.789778 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:59 crc kubenswrapper[4677]: I1007 13:07:59.789794 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:59 crc kubenswrapper[4677]: I1007 13:07:59.789872 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:59 crc kubenswrapper[4677]: I1007 13:07:59.789894 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:59Z","lastTransitionTime":"2025-10-07T13:07:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:07:59 crc kubenswrapper[4677]: I1007 13:07:59.927651 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:07:59 crc kubenswrapper[4677]: I1007 13:07:59.927687 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:07:59 crc kubenswrapper[4677]: I1007 13:07:59.927695 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:07:59 crc kubenswrapper[4677]: I1007 13:07:59.927708 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:07:59 crc kubenswrapper[4677]: I1007 13:07:59.927717 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:07:59Z","lastTransitionTime":"2025-10-07T13:07:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:08:00 crc kubenswrapper[4677]: I1007 13:08:00.030202 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:00 crc kubenswrapper[4677]: I1007 13:08:00.030264 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:00 crc kubenswrapper[4677]: I1007 13:08:00.030286 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:00 crc kubenswrapper[4677]: I1007 13:08:00.030311 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:00 crc kubenswrapper[4677]: I1007 13:08:00.030328 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:00Z","lastTransitionTime":"2025-10-07T13:08:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:00 crc kubenswrapper[4677]: I1007 13:08:00.133103 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:00 crc kubenswrapper[4677]: I1007 13:08:00.133164 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:00 crc kubenswrapper[4677]: I1007 13:08:00.133181 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:00 crc kubenswrapper[4677]: I1007 13:08:00.133205 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:00 crc kubenswrapper[4677]: I1007 13:08:00.133223 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:00Z","lastTransitionTime":"2025-10-07T13:08:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:00 crc kubenswrapper[4677]: I1007 13:08:00.237067 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:00 crc kubenswrapper[4677]: I1007 13:08:00.237161 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:00 crc kubenswrapper[4677]: I1007 13:08:00.237184 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:00 crc kubenswrapper[4677]: I1007 13:08:00.237212 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:00 crc kubenswrapper[4677]: I1007 13:08:00.237231 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:00Z","lastTransitionTime":"2025-10-07T13:08:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:08:00 crc kubenswrapper[4677]: I1007 13:08:00.302767 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 13:08:00 crc kubenswrapper[4677]: E1007 13:08:00.302981 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 13:08:00 crc kubenswrapper[4677]: I1007 13:08:00.340194 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:00 crc kubenswrapper[4677]: I1007 13:08:00.340255 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:00 crc kubenswrapper[4677]: I1007 13:08:00.340272 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:00 crc kubenswrapper[4677]: I1007 13:08:00.340328 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:00 crc kubenswrapper[4677]: I1007 13:08:00.340349 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:00Z","lastTransitionTime":"2025-10-07T13:08:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:00 crc kubenswrapper[4677]: I1007 13:08:00.443759 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:00 crc kubenswrapper[4677]: I1007 13:08:00.444121 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:00 crc kubenswrapper[4677]: I1007 13:08:00.444141 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:00 crc kubenswrapper[4677]: I1007 13:08:00.444166 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:00 crc kubenswrapper[4677]: I1007 13:08:00.444184 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:00Z","lastTransitionTime":"2025-10-07T13:08:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:08:00 crc kubenswrapper[4677]: I1007 13:08:00.546576 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:00 crc kubenswrapper[4677]: I1007 13:08:00.546670 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:00 crc kubenswrapper[4677]: I1007 13:08:00.546690 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:00 crc kubenswrapper[4677]: I1007 13:08:00.546714 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:00 crc kubenswrapper[4677]: I1007 13:08:00.546733 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:00Z","lastTransitionTime":"2025-10-07T13:08:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:00 crc kubenswrapper[4677]: I1007 13:08:00.649894 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:00 crc kubenswrapper[4677]: I1007 13:08:00.649936 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:00 crc kubenswrapper[4677]: I1007 13:08:00.649950 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:00 crc kubenswrapper[4677]: I1007 13:08:00.649971 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:00 crc kubenswrapper[4677]: I1007 13:08:00.649986 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:00Z","lastTransitionTime":"2025-10-07T13:08:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:00 crc kubenswrapper[4677]: I1007 13:08:00.752686 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:00 crc kubenswrapper[4677]: I1007 13:08:00.753006 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:00 crc kubenswrapper[4677]: I1007 13:08:00.753160 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:00 crc kubenswrapper[4677]: I1007 13:08:00.753305 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:00 crc kubenswrapper[4677]: I1007 13:08:00.753547 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:00Z","lastTransitionTime":"2025-10-07T13:08:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:08:00 crc kubenswrapper[4677]: I1007 13:08:00.856825 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:00 crc kubenswrapper[4677]: I1007 13:08:00.857117 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:00 crc kubenswrapper[4677]: I1007 13:08:00.857230 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:00 crc kubenswrapper[4677]: I1007 13:08:00.857350 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:00 crc kubenswrapper[4677]: I1007 13:08:00.857483 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:00Z","lastTransitionTime":"2025-10-07T13:08:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:01 crc kubenswrapper[4677]: I1007 13:08:01.001915 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:01 crc kubenswrapper[4677]: I1007 13:08:01.001954 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:01 crc kubenswrapper[4677]: I1007 13:08:01.001964 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:01 crc kubenswrapper[4677]: I1007 13:08:01.001981 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:01 crc kubenswrapper[4677]: I1007 13:08:01.001995 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:01Z","lastTransitionTime":"2025-10-07T13:08:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:01 crc kubenswrapper[4677]: I1007 13:08:01.042041 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 13:08:01 crc kubenswrapper[4677]: E1007 13:08:01.042282 4677 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 13:08:33.042234142 +0000 UTC m=+84.527943297 (durationBeforeRetry 32s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:08:01 crc kubenswrapper[4677]: I1007 13:08:01.104732 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:01 crc kubenswrapper[4677]: I1007 13:08:01.104794 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:01 crc kubenswrapper[4677]: I1007 13:08:01.104810 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:01 crc kubenswrapper[4677]: I1007 13:08:01.104835 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:01 crc kubenswrapper[4677]: I1007 13:08:01.104852 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:01Z","lastTransitionTime":"2025-10-07T13:08:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:01 crc kubenswrapper[4677]: I1007 13:08:01.143166 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:08:01 crc kubenswrapper[4677]: I1007 13:08:01.143288 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:08:01 crc kubenswrapper[4677]: E1007 13:08:01.143399 4677 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 07 13:08:01 crc kubenswrapper[4677]: E1007 13:08:01.143557 4677 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-07 13:08:33.143524756 +0000 UTC m=+84.629233911 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 07 13:08:01 crc kubenswrapper[4677]: E1007 13:08:01.143551 4677 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 07 13:08:01 crc kubenswrapper[4677]: E1007 13:08:01.143653 4677 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-07 13:08:33.14363599 +0000 UTC m=+84.629345335 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 07 13:08:01 crc kubenswrapper[4677]: I1007 13:08:01.209831 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:01 crc kubenswrapper[4677]: I1007 13:08:01.210168 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:01 crc kubenswrapper[4677]: I1007 13:08:01.210296 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:01 crc kubenswrapper[4677]: I1007 13:08:01.210503 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:01 crc kubenswrapper[4677]: I1007 13:08:01.210666 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:01Z","lastTransitionTime":"2025-10-07T13:08:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:08:01 crc kubenswrapper[4677]: I1007 13:08:01.245120 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 13:08:01 crc kubenswrapper[4677]: I1007 13:08:01.245294 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 13:08:01 crc kubenswrapper[4677]: E1007 13:08:01.245598 4677 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 07 13:08:01 crc kubenswrapper[4677]: E1007 13:08:01.245636 4677 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 07 13:08:01 crc kubenswrapper[4677]: E1007 13:08:01.245662 4677 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 13:08:01 crc kubenswrapper[4677]: E1007 13:08:01.245769 4677 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-07 13:08:33.245738168 +0000 UTC m=+84.731447333 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 13:08:01 crc kubenswrapper[4677]: E1007 13:08:01.246106 4677 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 07 13:08:01 crc kubenswrapper[4677]: E1007 13:08:01.246171 4677 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 07 13:08:01 crc kubenswrapper[4677]: E1007 13:08:01.246198 4677 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 13:08:01 crc kubenswrapper[4677]: E1007 13:08:01.246296 4677 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-07 13:08:33.246263924 +0000 UTC m=+84.731973089 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 13:08:01 crc kubenswrapper[4677]: I1007 13:08:01.302166 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8bljr" Oct 07 13:08:01 crc kubenswrapper[4677]: I1007 13:08:01.302229 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:08:01 crc kubenswrapper[4677]: I1007 13:08:01.302303 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 13:08:01 crc kubenswrapper[4677]: E1007 13:08:01.303002 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8bljr" podUID="f63a77a6-7e4a-4ed0-a996-b8f80233d10c" Oct 07 13:08:01 crc kubenswrapper[4677]: E1007 13:08:01.303140 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 13:08:01 crc kubenswrapper[4677]: E1007 13:08:01.303336 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 13:08:01 crc kubenswrapper[4677]: I1007 13:08:01.314681 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:01 crc kubenswrapper[4677]: I1007 13:08:01.314745 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:01 crc kubenswrapper[4677]: I1007 13:08:01.314764 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:01 crc kubenswrapper[4677]: I1007 13:08:01.314787 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:01 crc kubenswrapper[4677]: I1007 13:08:01.314805 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:01Z","lastTransitionTime":"2025-10-07T13:08:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:01 crc kubenswrapper[4677]: I1007 13:08:01.417413 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:01 crc kubenswrapper[4677]: I1007 13:08:01.417793 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:01 crc kubenswrapper[4677]: I1007 13:08:01.418259 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:01 crc kubenswrapper[4677]: I1007 13:08:01.418670 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:01 crc kubenswrapper[4677]: I1007 13:08:01.418859 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:01Z","lastTransitionTime":"2025-10-07T13:08:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:08:01 crc kubenswrapper[4677]: I1007 13:08:01.521938 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:01 crc kubenswrapper[4677]: I1007 13:08:01.521981 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:01 crc kubenswrapper[4677]: I1007 13:08:01.521999 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:01 crc kubenswrapper[4677]: I1007 13:08:01.522020 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:01 crc kubenswrapper[4677]: I1007 13:08:01.522035 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:01Z","lastTransitionTime":"2025-10-07T13:08:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:01 crc kubenswrapper[4677]: I1007 13:08:01.625103 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:01 crc kubenswrapper[4677]: I1007 13:08:01.625167 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:01 crc kubenswrapper[4677]: I1007 13:08:01.625185 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:01 crc kubenswrapper[4677]: I1007 13:08:01.625208 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:01 crc kubenswrapper[4677]: I1007 13:08:01.625228 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:01Z","lastTransitionTime":"2025-10-07T13:08:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:01 crc kubenswrapper[4677]: I1007 13:08:01.728088 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:01 crc kubenswrapper[4677]: I1007 13:08:01.728153 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:01 crc kubenswrapper[4677]: I1007 13:08:01.728170 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:01 crc kubenswrapper[4677]: I1007 13:08:01.728197 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:01 crc kubenswrapper[4677]: I1007 13:08:01.728220 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:01Z","lastTransitionTime":"2025-10-07T13:08:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:08:01 crc kubenswrapper[4677]: I1007 13:08:01.831578 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:01 crc kubenswrapper[4677]: I1007 13:08:01.832126 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:01 crc kubenswrapper[4677]: I1007 13:08:01.832275 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:01 crc kubenswrapper[4677]: I1007 13:08:01.832415 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:01 crc kubenswrapper[4677]: I1007 13:08:01.832581 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:01Z","lastTransitionTime":"2025-10-07T13:08:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:01 crc kubenswrapper[4677]: I1007 13:08:01.935980 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:01 crc kubenswrapper[4677]: I1007 13:08:01.936054 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:01 crc kubenswrapper[4677]: I1007 13:08:01.936078 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:01 crc kubenswrapper[4677]: I1007 13:08:01.936111 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:01 crc kubenswrapper[4677]: I1007 13:08:01.936140 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:01Z","lastTransitionTime":"2025-10-07T13:08:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:02 crc kubenswrapper[4677]: I1007 13:08:02.038284 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:02 crc kubenswrapper[4677]: I1007 13:08:02.038352 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:02 crc kubenswrapper[4677]: I1007 13:08:02.038367 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:02 crc kubenswrapper[4677]: I1007 13:08:02.038391 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:02 crc kubenswrapper[4677]: I1007 13:08:02.038405 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:02Z","lastTransitionTime":"2025-10-07T13:08:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:08:02 crc kubenswrapper[4677]: I1007 13:08:02.141161 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:02 crc kubenswrapper[4677]: I1007 13:08:02.141199 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:02 crc kubenswrapper[4677]: I1007 13:08:02.141208 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:02 crc kubenswrapper[4677]: I1007 13:08:02.141222 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:02 crc kubenswrapper[4677]: I1007 13:08:02.141230 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:02Z","lastTransitionTime":"2025-10-07T13:08:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:02 crc kubenswrapper[4677]: I1007 13:08:02.244283 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:02 crc kubenswrapper[4677]: I1007 13:08:02.244355 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:02 crc kubenswrapper[4677]: I1007 13:08:02.244373 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:02 crc kubenswrapper[4677]: I1007 13:08:02.244400 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:02 crc kubenswrapper[4677]: I1007 13:08:02.244417 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:02Z","lastTransitionTime":"2025-10-07T13:08:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:02 crc kubenswrapper[4677]: I1007 13:08:02.303038 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 13:08:02 crc kubenswrapper[4677]: E1007 13:08:02.303157 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 13:08:02 crc kubenswrapper[4677]: I1007 13:08:02.347815 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:02 crc kubenswrapper[4677]: I1007 13:08:02.347878 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:02 crc kubenswrapper[4677]: I1007 13:08:02.347901 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:02 crc kubenswrapper[4677]: I1007 13:08:02.347929 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:02 crc kubenswrapper[4677]: I1007 13:08:02.347951 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:02Z","lastTransitionTime":"2025-10-07T13:08:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:02 crc kubenswrapper[4677]: I1007 13:08:02.451082 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:02 crc kubenswrapper[4677]: I1007 13:08:02.451139 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:02 crc kubenswrapper[4677]: I1007 13:08:02.451155 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:02 crc kubenswrapper[4677]: I1007 13:08:02.451180 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:02 crc kubenswrapper[4677]: I1007 13:08:02.451200 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:02Z","lastTransitionTime":"2025-10-07T13:08:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:08:02 crc kubenswrapper[4677]: I1007 13:08:02.554475 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:02 crc kubenswrapper[4677]: I1007 13:08:02.554553 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:02 crc kubenswrapper[4677]: I1007 13:08:02.554577 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:02 crc kubenswrapper[4677]: I1007 13:08:02.554601 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:02 crc kubenswrapper[4677]: I1007 13:08:02.554617 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:02Z","lastTransitionTime":"2025-10-07T13:08:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:02 crc kubenswrapper[4677]: I1007 13:08:02.657172 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:02 crc kubenswrapper[4677]: I1007 13:08:02.657248 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:02 crc kubenswrapper[4677]: I1007 13:08:02.657277 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:02 crc kubenswrapper[4677]: I1007 13:08:02.657362 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:02 crc kubenswrapper[4677]: I1007 13:08:02.657382 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:02Z","lastTransitionTime":"2025-10-07T13:08:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:02 crc kubenswrapper[4677]: I1007 13:08:02.760726 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:02 crc kubenswrapper[4677]: I1007 13:08:02.760820 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:02 crc kubenswrapper[4677]: I1007 13:08:02.760837 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:02 crc kubenswrapper[4677]: I1007 13:08:02.760861 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:02 crc kubenswrapper[4677]: I1007 13:08:02.760878 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:02Z","lastTransitionTime":"2025-10-07T13:08:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:08:02 crc kubenswrapper[4677]: I1007 13:08:02.863749 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:02 crc kubenswrapper[4677]: I1007 13:08:02.863805 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:02 crc kubenswrapper[4677]: I1007 13:08:02.863821 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:02 crc kubenswrapper[4677]: I1007 13:08:02.863842 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:02 crc kubenswrapper[4677]: I1007 13:08:02.863859 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:02Z","lastTransitionTime":"2025-10-07T13:08:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:02 crc kubenswrapper[4677]: I1007 13:08:02.966059 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:02 crc kubenswrapper[4677]: I1007 13:08:02.966106 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:02 crc kubenswrapper[4677]: I1007 13:08:02.966136 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:02 crc kubenswrapper[4677]: I1007 13:08:02.966165 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:02 crc kubenswrapper[4677]: I1007 13:08:02.966190 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:02Z","lastTransitionTime":"2025-10-07T13:08:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:03 crc kubenswrapper[4677]: I1007 13:08:03.069630 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:03 crc kubenswrapper[4677]: I1007 13:08:03.069742 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:03 crc kubenswrapper[4677]: I1007 13:08:03.069769 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:03 crc kubenswrapper[4677]: I1007 13:08:03.069804 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:03 crc kubenswrapper[4677]: I1007 13:08:03.069841 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:03Z","lastTransitionTime":"2025-10-07T13:08:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:08:03 crc kubenswrapper[4677]: I1007 13:08:03.172386 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:03 crc kubenswrapper[4677]: I1007 13:08:03.172451 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:03 crc kubenswrapper[4677]: I1007 13:08:03.172466 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:03 crc kubenswrapper[4677]: I1007 13:08:03.172481 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:03 crc kubenswrapper[4677]: I1007 13:08:03.172491 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:03Z","lastTransitionTime":"2025-10-07T13:08:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:03 crc kubenswrapper[4677]: I1007 13:08:03.275043 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:03 crc kubenswrapper[4677]: I1007 13:08:03.275102 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:03 crc kubenswrapper[4677]: I1007 13:08:03.275114 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:03 crc kubenswrapper[4677]: I1007 13:08:03.275129 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:03 crc kubenswrapper[4677]: I1007 13:08:03.275138 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:03Z","lastTransitionTime":"2025-10-07T13:08:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:03 crc kubenswrapper[4677]: I1007 13:08:03.302979 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 13:08:03 crc kubenswrapper[4677]: I1007 13:08:03.303024 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:08:03 crc kubenswrapper[4677]: I1007 13:08:03.303083 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8bljr" Oct 07 13:08:03 crc kubenswrapper[4677]: E1007 13:08:03.303197 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 13:08:03 crc kubenswrapper[4677]: E1007 13:08:03.303342 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 13:08:03 crc kubenswrapper[4677]: E1007 13:08:03.303401 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8bljr" podUID="f63a77a6-7e4a-4ed0-a996-b8f80233d10c" Oct 07 13:08:03 crc kubenswrapper[4677]: I1007 13:08:03.378515 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:03 crc kubenswrapper[4677]: I1007 13:08:03.378564 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:03 crc kubenswrapper[4677]: I1007 13:08:03.378574 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:03 crc kubenswrapper[4677]: I1007 13:08:03.378590 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:03 crc kubenswrapper[4677]: I1007 13:08:03.378600 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:03Z","lastTransitionTime":"2025-10-07T13:08:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:03 crc kubenswrapper[4677]: I1007 13:08:03.481242 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:03 crc kubenswrapper[4677]: I1007 13:08:03.481293 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:03 crc kubenswrapper[4677]: I1007 13:08:03.481307 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:03 crc kubenswrapper[4677]: I1007 13:08:03.481327 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:03 crc kubenswrapper[4677]: I1007 13:08:03.481342 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:03Z","lastTransitionTime":"2025-10-07T13:08:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:08:03 crc kubenswrapper[4677]: I1007 13:08:03.584327 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:03 crc kubenswrapper[4677]: I1007 13:08:03.584366 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:03 crc kubenswrapper[4677]: I1007 13:08:03.584376 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:03 crc kubenswrapper[4677]: I1007 13:08:03.584391 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:03 crc kubenswrapper[4677]: I1007 13:08:03.584400 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:03Z","lastTransitionTime":"2025-10-07T13:08:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:03 crc kubenswrapper[4677]: I1007 13:08:03.686884 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:03 crc kubenswrapper[4677]: I1007 13:08:03.686982 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:03 crc kubenswrapper[4677]: I1007 13:08:03.687014 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:03 crc kubenswrapper[4677]: I1007 13:08:03.687047 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:03 crc kubenswrapper[4677]: I1007 13:08:03.687073 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:03Z","lastTransitionTime":"2025-10-07T13:08:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:03 crc kubenswrapper[4677]: I1007 13:08:03.791514 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:03 crc kubenswrapper[4677]: I1007 13:08:03.791578 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:03 crc kubenswrapper[4677]: I1007 13:08:03.791595 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:03 crc kubenswrapper[4677]: I1007 13:08:03.791617 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:03 crc kubenswrapper[4677]: I1007 13:08:03.791632 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:03Z","lastTransitionTime":"2025-10-07T13:08:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:08:03 crc kubenswrapper[4677]: I1007 13:08:03.894566 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:03 crc kubenswrapper[4677]: I1007 13:08:03.894611 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:03 crc kubenswrapper[4677]: I1007 13:08:03.894624 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:03 crc kubenswrapper[4677]: I1007 13:08:03.894645 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:03 crc kubenswrapper[4677]: I1007 13:08:03.894659 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:03Z","lastTransitionTime":"2025-10-07T13:08:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:03 crc kubenswrapper[4677]: I1007 13:08:03.996998 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:03 crc kubenswrapper[4677]: I1007 13:08:03.997053 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:03 crc kubenswrapper[4677]: I1007 13:08:03.997069 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:03 crc kubenswrapper[4677]: I1007 13:08:03.997091 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:03 crc kubenswrapper[4677]: I1007 13:08:03.997108 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:03Z","lastTransitionTime":"2025-10-07T13:08:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:04 crc kubenswrapper[4677]: I1007 13:08:04.099668 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:04 crc kubenswrapper[4677]: I1007 13:08:04.099764 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:04 crc kubenswrapper[4677]: I1007 13:08:04.099788 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:04 crc kubenswrapper[4677]: I1007 13:08:04.099818 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:04 crc kubenswrapper[4677]: I1007 13:08:04.099843 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:04Z","lastTransitionTime":"2025-10-07T13:08:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:08:04 crc kubenswrapper[4677]: I1007 13:08:04.203201 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:04 crc kubenswrapper[4677]: I1007 13:08:04.203272 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:04 crc kubenswrapper[4677]: I1007 13:08:04.203296 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:04 crc kubenswrapper[4677]: I1007 13:08:04.203331 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:04 crc kubenswrapper[4677]: I1007 13:08:04.203355 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:04Z","lastTransitionTime":"2025-10-07T13:08:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:04 crc kubenswrapper[4677]: I1007 13:08:04.302790 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 13:08:04 crc kubenswrapper[4677]: E1007 13:08:04.302933 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 13:08:04 crc kubenswrapper[4677]: I1007 13:08:04.305531 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:04 crc kubenswrapper[4677]: I1007 13:08:04.305573 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:04 crc kubenswrapper[4677]: I1007 13:08:04.305586 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:04 crc kubenswrapper[4677]: I1007 13:08:04.305601 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:04 crc kubenswrapper[4677]: I1007 13:08:04.305613 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:04Z","lastTransitionTime":"2025-10-07T13:08:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:08:04 crc kubenswrapper[4677]: I1007 13:08:04.407900 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:04 crc kubenswrapper[4677]: I1007 13:08:04.408044 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:04 crc kubenswrapper[4677]: I1007 13:08:04.408124 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:04 crc kubenswrapper[4677]: I1007 13:08:04.408153 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:04 crc kubenswrapper[4677]: I1007 13:08:04.408188 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:04Z","lastTransitionTime":"2025-10-07T13:08:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:04 crc kubenswrapper[4677]: I1007 13:08:04.510506 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:04 crc kubenswrapper[4677]: I1007 13:08:04.510554 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:04 crc kubenswrapper[4677]: I1007 13:08:04.510577 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:04 crc kubenswrapper[4677]: I1007 13:08:04.510598 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:04 crc kubenswrapper[4677]: I1007 13:08:04.510611 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:04Z","lastTransitionTime":"2025-10-07T13:08:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:04 crc kubenswrapper[4677]: I1007 13:08:04.614189 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:04 crc kubenswrapper[4677]: I1007 13:08:04.614314 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:04 crc kubenswrapper[4677]: I1007 13:08:04.614333 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:04 crc kubenswrapper[4677]: I1007 13:08:04.614355 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:04 crc kubenswrapper[4677]: I1007 13:08:04.614372 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:04Z","lastTransitionTime":"2025-10-07T13:08:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:08:04 crc kubenswrapper[4677]: I1007 13:08:04.717605 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:04 crc kubenswrapper[4677]: I1007 13:08:04.717688 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:04 crc kubenswrapper[4677]: I1007 13:08:04.717712 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:04 crc kubenswrapper[4677]: I1007 13:08:04.717740 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:04 crc kubenswrapper[4677]: I1007 13:08:04.717764 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:04Z","lastTransitionTime":"2025-10-07T13:08:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:04 crc kubenswrapper[4677]: I1007 13:08:04.821226 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:04 crc kubenswrapper[4677]: I1007 13:08:04.821289 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:04 crc kubenswrapper[4677]: I1007 13:08:04.821307 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:04 crc kubenswrapper[4677]: I1007 13:08:04.821364 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:04 crc kubenswrapper[4677]: I1007 13:08:04.821382 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:04Z","lastTransitionTime":"2025-10-07T13:08:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:04 crc kubenswrapper[4677]: I1007 13:08:04.924244 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:04 crc kubenswrapper[4677]: I1007 13:08:04.924337 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:04 crc kubenswrapper[4677]: I1007 13:08:04.924356 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:04 crc kubenswrapper[4677]: I1007 13:08:04.924379 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:04 crc kubenswrapper[4677]: I1007 13:08:04.924396 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:04Z","lastTransitionTime":"2025-10-07T13:08:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:08:05 crc kubenswrapper[4677]: I1007 13:08:05.027320 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:05 crc kubenswrapper[4677]: I1007 13:08:05.027375 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:05 crc kubenswrapper[4677]: I1007 13:08:05.027392 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:05 crc kubenswrapper[4677]: I1007 13:08:05.027412 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:05 crc kubenswrapper[4677]: I1007 13:08:05.027449 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:05Z","lastTransitionTime":"2025-10-07T13:08:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:05 crc kubenswrapper[4677]: I1007 13:08:05.129549 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:05 crc kubenswrapper[4677]: I1007 13:08:05.129626 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:05 crc kubenswrapper[4677]: I1007 13:08:05.129650 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:05 crc kubenswrapper[4677]: I1007 13:08:05.129679 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:05 crc kubenswrapper[4677]: I1007 13:08:05.129705 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:05Z","lastTransitionTime":"2025-10-07T13:08:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:08:05 crc kubenswrapper[4677]: I1007 13:08:05.145320 4677 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 07 13:08:05 crc kubenswrapper[4677]: I1007 13:08:05.158228 4677 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Oct 07 13:08:05 crc kubenswrapper[4677]: I1007 13:08:05.174290 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9c35782-52f8-4fbc-9e52-07ee92002e3d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9182677b05c8d32f333b4e806b6dc29e0ce3f6171616ed303459ccb6a3754a4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://550a4491cebcbd8b3a62831cce07b13bb79051cd51505aab1f74bcfee692f7b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d90d7cc786a9f94c269a99be97c00685a2e10bde12e0afe4db2de40b95749a47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"ima
geID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e99541a9f53339e760fb1074be18ebfcb8b225c64c290478559d2e3722ba9296\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73cec6b690e4d01d9d206a812f278832d622a7bdfb74ddcfb5904e19f721fae6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f16639187300b01b7c9f30dda26da7c1de9de9c66baf7ad716875eba41a5d7d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f16639187300b01b7c9f30dda26da7c1de9de9c66baf7ad716875eba41a5d7d6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dab4e59081dcacddedf345841dce8c940fae52aeee55d6c956e1364dd70872b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-d
ev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dab4e59081dcacddedf345841dce8c940fae52aeee55d6c956e1364dd70872b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://abcfe5aa83c73ae24b7f0b414b47344969924ca50a5e1669bfb4704f1386cd17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abcfe5aa83c73ae24b7f0b414b47344969924ca50a5e1669bfb4704f1386cd17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:05Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:05 crc kubenswrapper[4677]: I1007 13:08:05.195801 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ad65790-6a90-4c21-b5c5-ac1ddf2cbe52\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34b15b8f71a10920a74f784c3440031e14726f661e10f628b269da08e70a7cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a51c6d4e82136b444754dc679f864558f74624af4ff94f794e473c92c8f6c87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96d7ddc445a61d5fd4959d1b3b4e2c93503111a12f461d945dd298a3f8540f65\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13a4e9dcaf4f4585c45625444ba093f84acc83f03e96235efd054bed4a38fc21\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f3139524d94a487c756b4a7e3660c0a4380bcba8aa8b588368530f43aab0b1d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T13:07:25Z\\\",\\\"message\\\":\\\"le observer\\\\nW1007 13:07:25.065953 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1007 13:07:25.066115 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 13:07:25.067558 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2107846005/tls.crt::/tmp/serving-cert-2107846005/tls.key\\\\\\\"\\\\nI1007 13:07:25.378394 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1007 13:07:25.383525 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1007 13:07:25.383565 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1007 13:07:25.383605 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1007 13:07:25.383617 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1007 13:07:25.390977 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1007 13:07:25.391014 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1007 13:07:25.391020 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 13:07:25.391038 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 13:07:25.391053 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1007 13:07:25.391061 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1007 13:07:25.391071 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1007 13:07:25.391088 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1007 13:07:25.394664 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b92962cb41f37a615f473651c01e37f5d53e01f3fb4b7c0eb2092095bb55239\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d927fbc5242eb951e8c2ec6d2859315f34e4b190bb43e646044111d1f583bf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d927fbc5242eb951e8c2ec6d2859315f34e4b190bb43e646044111d1f583bf0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:05Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:05 crc kubenswrapper[4677]: I1007 13:08:05.215903 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ef29067dc23263a94c4f861ada9ebbe04aae442de3da9fa34db521177f60ce8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:05Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:05 crc kubenswrapper[4677]: I1007 13:08:05.232577 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:05 crc kubenswrapper[4677]: I1007 13:08:05.232613 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:05 crc kubenswrapper[4677]: I1007 13:08:05.232624 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:05 crc kubenswrapper[4677]: I1007 13:08:05.232639 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:05 crc kubenswrapper[4677]: I1007 13:08:05.232650 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:05Z","lastTransitionTime":"2025-10-07T13:08:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:08:05 crc kubenswrapper[4677]: I1007 13:08:05.235426 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:05Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:05 crc kubenswrapper[4677]: I1007 13:08:05.251030 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8xd94" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f78c6e9a-e5e3-4296-b2e6-3ba36d1808ff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2e7ebbc9f01ac7f853075c65c8cc57c691cf3f95e41036294486ad4a3bb807c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8xd94\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:05Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:05 crc kubenswrapper[4677]: I1007 13:08:05.269104 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:05Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:05 crc kubenswrapper[4677]: I1007 13:08:05.285813 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e49f4b2f98de9e297e6a31a5583120192adf9a013700b49bb419e54d9e75fdbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:05Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:05 crc kubenswrapper[4677]: I1007 13:08:05.302075 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:08:05 crc kubenswrapper[4677]: I1007 13:08:05.302207 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8bljr" Oct 07 13:08:05 crc kubenswrapper[4677]: E1007 13:08:05.302318 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 13:08:05 crc kubenswrapper[4677]: I1007 13:08:05.302384 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 13:08:05 crc kubenswrapper[4677]: I1007 13:08:05.302037 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c40c47d2-50a4-43f1-9b6e-08b60a3260c5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f36e52a7e88b59d8fd38c1fe659ce9b539e514c9d31e326a3ed647ebb8d19781\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://84b1c015461fecca9e5122abe950f33e24f4b7188568ea84cb059a08a4637963\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4b
a8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ecf81a2a9f147c0d9643f8e6c45248164053203ca4e5bbdc57c38e5803a5386\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e8dc3f8bdc52104efdb49a017d6497e2aaa3ed2b593794413fcd1acf2e06d36\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:05Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:05 crc kubenswrapper[4677]: E1007 13:08:05.302576 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 13:08:05 crc kubenswrapper[4677]: E1007 13:08:05.303027 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8bljr" podUID="f63a77a6-7e4a-4ed0-a996-b8f80233d10c" Oct 07 13:08:05 crc kubenswrapper[4677]: I1007 13:08:05.303366 4677 scope.go:117] "RemoveContainer" containerID="8fba8b3547cb6e55cb13bf4889eabf3bf5e731d44475c488c659fe844b985074" Oct 07 13:08:05 crc kubenswrapper[4677]: I1007 13:08:05.320697 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:05Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:05 crc kubenswrapper[4677]: I1007 13:08:05.335605 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:05 crc kubenswrapper[4677]: I1007 13:08:05.335634 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:05 crc kubenswrapper[4677]: I1007 13:08:05.335645 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:05 crc kubenswrapper[4677]: I1007 13:08:05.335660 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:05 crc kubenswrapper[4677]: I1007 13:08:05.335671 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:05Z","lastTransitionTime":"2025-10-07T13:08:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:08:05 crc kubenswrapper[4677]: I1007 13:08:05.339267 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-r7cnz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7879fa59-a7cb-4d29-ba3a-c91f43bfcba6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5b2cfaaf4533573a7cdf927cb9a0b61690f4f04ca22f5da5013fd218ee2cba1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r59hd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d3e4ef8267212ad1faf24bfcb3b6f633a283684ba587e304e94d434bd9a2618\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r59hd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-r7cnz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:05Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:05 crc kubenswrapper[4677]: I1007 13:08:05.363781 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29c8j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3458826a-000d-407d-92c8-236d1a05842e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cf7d8cdd34bc883eae38c5e4690efd4e1c29cc633b5bbadc5de2b5b844a9da3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b77b2aafb3baf1c5b72d62156bd1c1bec76385637d5795166fe3d4f22a169503\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kuber
netes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99b5fbb5ad3249aa5264c37bd635ed5f6283ec72c7eb071002cd7bddc12052f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eee7c253a1a514447553be977a3e534608ef6a1178664bf139ee84ec41180db0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f333db7aeb7d3cd308131992b4cd1284c1c56e27bbfd731404febc0efc953925\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ddf4e352b778815786f6fb204486a53d958310e53569f89a2895fe388a727da\\\",\\\"image
\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fba8b3547cb6e55cb13bf4889eabf3bf5e731d44475c488c659fe844b985074\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fba8b3547cb6e55cb13bf4889eabf3bf5e731d44475c488c659fe844b985074\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T13:07:41Z\\\",\\\"message\\\":\\\"mage-registry]} name:Service_openshift-image-registry/image-registry_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.93:5000:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {83c1e277-3d22-42ae-a355-f7a0ff0bd171}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1007 13:07:41.452562 6109 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-ingress-canary/ingress-canary]} name:Service_openshift-ingress-canary/ingress-canary_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.34:8443: 10.217.5.34:8888:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7715118b-bb1b-400a-803e-7ab2cc3eeec0}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1007 13:07:41.452766 6109 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-console/console\\\\\\\"}\\\\nF1007 13:07:41.452679 6109 
ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:40Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-29c8j_openshift-ovn-kubernetes(3458826a-000d-407d-92c8-236d1a05842e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1f410624ff7e026c196d43d5ef830ce7b34981b703d5399a135dab0122640ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"r
ecursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa229d0805576aa04db8e58e52e8c8bdc07663414dde42a94cac4fe628cfac7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa229d0805576aa04db8e58e52e8c8bdc07663414dde42a94cac4fe628cfac7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-29c8j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:05Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:05 crc kubenswrapper[4677]: I1007 13:08:05.384347 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pjgpx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"73bebfb3-50b5-48b6-b348-1d1feb6202d2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b0ac92c71edc3d5107aece2d0e005a546cf25d79d696f4e330b7c0c8babc546\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h59cg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pjgpx\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:05Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:05 crc kubenswrapper[4677]: I1007 13:08:05.397427 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qf2v9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea5a5436-29f6-4edd-9d4d-22eb9dd828c3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1a8e8a31adbc84ed02ff984941bb00da95740b19e8717fc6d4fb39b62338973\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ftmxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://97188126bcea5ad3844f74c9402e831926e1142944778240b4d4b26da7ea40c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ftmxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\
\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qf2v9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:05Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:05 crc kubenswrapper[4677]: I1007 13:08:05.412965 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8bljr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f63a77a6-7e4a-4ed0-a996-b8f80233d10c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rc97j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rc97j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:43Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8bljr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:05Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:05 crc kubenswrapper[4677]: I1007 13:08:05.431338 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf9347ca53bc58ad2e19bdbccd5eb40fde5ef36cdc0c2a2899e7e86977208446\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7932ee6d24ab75f34dabb17b5c2732dc1437e94b4fab6cace5c5bf4d8b4a8fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:05Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:05 crc kubenswrapper[4677]: I1007 13:08:05.438539 
4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:05 crc kubenswrapper[4677]: I1007 13:08:05.438607 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:05 crc kubenswrapper[4677]: I1007 13:08:05.438634 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:05 crc kubenswrapper[4677]: I1007 13:08:05.438667 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:05 crc kubenswrapper[4677]: I1007 13:08:05.438691 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:05Z","lastTransitionTime":"2025-10-07T13:08:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:05 crc kubenswrapper[4677]: I1007 13:08:05.448189 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c2h2k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6a7b491-6ed9-4906-8d2d-d8913a581b95\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://079f44a0676fd6e659268707658dfce76f5c80881ebd1b7f77b831a653002cbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gh4gv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c2h2k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": 
failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:05Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:05 crc kubenswrapper[4677]: I1007 13:08:05.475201 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-czmsr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67f7f734-b59a-447c-b4a5-5aeb78d3a4dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://620c115cda692d86e1c655fe633ade8d56b4ad3faff70ec3383e0d6931e91acd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84e2aad54ae491e74845fbb681491bde96694b5f242d15a1b4c0f62b81048e7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84e2aad54ae491e74845fbb681491bde96694b5f242d15a1b4c0f62b81048e7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-
api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2dfd4b5d1804d9d8e9ccf4dea8fb23cf60cd039e32d4bfc7d0b0e189da178b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2dfd4b5d1804d9d8e9ccf4dea8fb23cf60cd039e32d4bfc7d0b0e189da178b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c337ed2525a5d0aca41ff193138829ecd98172765110f97081d5a9b57ea39519\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c337ed2525a5d0aca41ff193138829ecd98172765110f97081d5a9b57ea39519\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b53d862bdcbdf9ebb58ca6717073ecf18236be981bf633bf22e3a077fa6ed90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b53d862bdcbdf9ebb58ca6717073ecf18236be981bf633bf22e3a077fa6ed90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:32Z\\\"
,\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3de85e8780fadb509355343e297ea73c2bd1047219be9911d445e044d40d378b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3de85e8780fadb509355343e297ea73c2bd1047219be9911d445e044d40d378b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6309cf358d3d85018e9e89fdb974146ab00944de09d3b38cfbf357df59b442f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6309cf358d3d85018e9e89fdb974146ab00944de09d3b38cfbf357df59b442f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-czmsr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-10-07T13:08:05Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:05 crc kubenswrapper[4677]: I1007 13:08:05.541621 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:05 crc kubenswrapper[4677]: I1007 13:08:05.541658 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:05 crc kubenswrapper[4677]: I1007 13:08:05.541670 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:05 crc kubenswrapper[4677]: I1007 13:08:05.541684 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:05 crc kubenswrapper[4677]: I1007 13:08:05.541696 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:05Z","lastTransitionTime":"2025-10-07T13:08:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:05 crc kubenswrapper[4677]: I1007 13:08:05.645031 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:05 crc kubenswrapper[4677]: I1007 13:08:05.645091 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:05 crc kubenswrapper[4677]: I1007 13:08:05.645107 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:05 crc kubenswrapper[4677]: I1007 13:08:05.645131 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:05 crc kubenswrapper[4677]: I1007 13:08:05.645147 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:05Z","lastTransitionTime":"2025-10-07T13:08:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:08:05 crc kubenswrapper[4677]: I1007 13:08:05.669614 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:05 crc kubenswrapper[4677]: I1007 13:08:05.669680 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:05 crc kubenswrapper[4677]: I1007 13:08:05.669697 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:05 crc kubenswrapper[4677]: I1007 13:08:05.669725 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:05 crc kubenswrapper[4677]: I1007 13:08:05.669742 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:05Z","lastTransitionTime":"2025-10-07T13:08:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:05 crc kubenswrapper[4677]: E1007 13:08:05.698376 4677 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:08:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:08:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:08:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:08:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:08:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:08:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:08:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:08:05Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2461c0fe-8a8b-483d-90f2-2a3d8d7aca47\\\",\\\"systemUUID\\\":\\\"68c6c527-b248-4c1e-9fd2-b44685e78bcf\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:05Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:05 crc kubenswrapper[4677]: I1007 13:08:05.702265 4677 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-29c8j_3458826a-000d-407d-92c8-236d1a05842e/ovnkube-controller/1.log" Oct 07 13:08:05 crc kubenswrapper[4677]: I1007 13:08:05.706039 4677 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:05 crc kubenswrapper[4677]: I1007 13:08:05.706099 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:05 crc kubenswrapper[4677]: I1007 13:08:05.706116 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:05 crc kubenswrapper[4677]: I1007 13:08:05.706138 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:05 crc kubenswrapper[4677]: I1007 13:08:05.706154 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:05Z","lastTransitionTime":"2025-10-07T13:08:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:05 crc kubenswrapper[4677]: I1007 13:08:05.707546 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29c8j" event={"ID":"3458826a-000d-407d-92c8-236d1a05842e","Type":"ContainerStarted","Data":"d41f7a2621bf9c7b0e852cc4fb11dc29a2856c5b6519628df5c77b064c310b1b"} Oct 07 13:08:05 crc kubenswrapper[4677]: I1007 13:08:05.708255 4677 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-29c8j" Oct 07 13:08:05 crc kubenswrapper[4677]: I1007 13:08:05.729265 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ef29067dc23263a94c4f861ada9ebbe04aae442de3da9fa34db521177f60ce8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:05Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:05 crc kubenswrapper[4677]: E1007 13:08:05.733079 4677 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:08:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:08:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:08:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:08:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:08:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:08:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:08:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:08:05Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2461c0fe-8a8b-483d-90f2-2a3d8d7aca47\\\",\\\"systemUUID\\\":\\\"68c6c527-b248-4c1e-9fd2-b44685e78bcf\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:05Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:05 crc kubenswrapper[4677]: I1007 13:08:05.737667 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:05 crc kubenswrapper[4677]: I1007 13:08:05.737709 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 07 13:08:05 crc kubenswrapper[4677]: I1007 13:08:05.737726 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:05 crc kubenswrapper[4677]: I1007 13:08:05.737749 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:05 crc kubenswrapper[4677]: I1007 13:08:05.737767 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:05Z","lastTransitionTime":"2025-10-07T13:08:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:05 crc kubenswrapper[4677]: I1007 13:08:05.751062 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:05Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:05 crc kubenswrapper[4677]: I1007 13:08:05.779369 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8xd94" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f78c6e9a-e5e3-4296-b2e6-3ba36d1808ff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2e7ebbc9f01ac7f853075c65c8cc57c691cf3f95e41036294486ad4a3bb807c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8xd94\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:05Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:05 crc kubenswrapper[4677]: E1007 13:08:05.786166 4677 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:08:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:08:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:08:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:08:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:08:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:08:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:08:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:08:05Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2461c0fe-8a8b-483d-90f2-2a3d8d7aca47\\\",\\\"systemUUID\\\":\\\"68c6c527-b248-4c1e-9fd2-b44685e78bcf\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:05Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:05 crc kubenswrapper[4677]: I1007 13:08:05.789562 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:05 crc kubenswrapper[4677]: I1007 13:08:05.789658 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 07 13:08:05 crc kubenswrapper[4677]: I1007 13:08:05.789672 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:05 crc kubenswrapper[4677]: I1007 13:08:05.789688 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:05 crc kubenswrapper[4677]: I1007 13:08:05.789701 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:05Z","lastTransitionTime":"2025-10-07T13:08:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:05 crc kubenswrapper[4677]: E1007 13:08:05.803893 4677 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:08:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:08:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:08:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:08:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:08:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:08:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:08:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:08:05Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2461c0fe-8a8b-483d-90f2-2a3d8d7aca47\\\",\\\"systemUUID\\\":\\\"68c6c527-b248-4c1e-9fd2-b44685e78bcf\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:05Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:05 crc kubenswrapper[4677]: I1007 13:08:05.808179 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:05 crc kubenswrapper[4677]: I1007 13:08:05.808257 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 07 13:08:05 crc kubenswrapper[4677]: I1007 13:08:05.808274 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:05 crc kubenswrapper[4677]: I1007 13:08:05.808294 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:05 crc kubenswrapper[4677]: I1007 13:08:05.808308 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:05Z","lastTransitionTime":"2025-10-07T13:08:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:05 crc kubenswrapper[4677]: I1007 13:08:05.816008 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9c35782-52f8-4fbc-9e52-07ee92002e3d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9182677b05c8d32f333b4e806b6dc29e0ce3f6171616ed303459ccb6a3754a4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://550a4491cebcbd8b3a62831cce07b13bb79051cd51505aab1f74bcfee692f7b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resourc
es\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d90d7cc786a9f94c269a99be97c00685a2e10bde12e0afe4db2de40b95749a47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e99541a9f53339e760fb1074be18ebfcb8b225c64c290478559d2e3722ba9296\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73cec6b690e4d01d9d206a812f278832d622a7bdfb74ddcfb5904e19f721fae6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f16639187300b01b7c9f30dda26da7c1de9de9c66baf7ad716875eba41a5d7d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f16639187300b01b7c9f30dda26da7c1de9de9c66baf7ad716875eba41a5d7d6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:10Z\\\",\\\"reason\\\":\\\"Complet
ed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dab4e59081dcacddedf345841dce8c940fae52aeee55d6c956e1364dd70872b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dab4e59081dcacddedf345841dce8c940fae52aeee55d6c956e1364dd70872b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://abcfe5aa83c73ae24b7f0b414b47344969924ca50a5e1669bfb4704f1386cd17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abcfe5aa83c73ae24b7f0b414b47344969924ca50a5e1669bfb4704f1386cd17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:05Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:05 crc kubenswrapper[4677]: E1007 13:08:05.831478 4677 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:08:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:08:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:08:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:08:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:08:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:08:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:08:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:08:05Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\
"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":45063
7738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2461c0fe-8a8b-483d-90f2-2a3d8d7aca47\\\",\\\"systemUUID\\\":\\\"68c6c527-b248-4c1e-9fd2-b44685e78bcf\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:05Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:05 crc kubenswrapper[4677]: E1007 13:08:05.831778 4677 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 07 13:08:05 crc kubenswrapper[4677]: I1007 13:08:05.834369 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:05 crc kubenswrapper[4677]: I1007 13:08:05.834481 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:05 crc kubenswrapper[4677]: I1007 13:08:05.834511 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:05 crc kubenswrapper[4677]: I1007 13:08:05.834543 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:05 crc kubenswrapper[4677]: I1007 13:08:05.834564 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:05Z","lastTransitionTime":"2025-10-07T13:08:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:08:05 crc kubenswrapper[4677]: I1007 13:08:05.840968 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ad65790-6a90-4c21-b5c5-ac1ddf2cbe52\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34b15b8f71a10920a74f784c3440031e14726f661e10f628b269da08e70a7cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a51c6d4e82136b444754dc679f864558f74624af4ff94f794e473c92c8f6c87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96d7ddc445a61d5fd4959d1b3b4e2c93503111a12f461d945dd298a3f8540f65\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/ku
bernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13a4e9dcaf4f4585c45625444ba093f84acc83f03e96235efd054bed4a38fc21\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f3139524d94a487c756b4a7e3660c0a4380bcba8aa8b588368530f43aab0b1d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T13:07:25Z\\\",\\\"message\\\":\\\"le observer\\\\nW1007 13:07:25.065953 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1007 13:07:25.066115 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 13:07:25.067558 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2107846005/tls.crt::/tmp/serving-cert-2107846005/tls.key\\\\\\\"\\\\nI1007 13:07:25.378394 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1007 13:07:25.383525 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1007 13:07:25.383565 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1007 13:07:25.383605 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1007 13:07:25.383617 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1007 13:07:25.390977 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1007 13:07:25.391014 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1007 13:07:25.391020 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 13:07:25.391038 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 13:07:25.391053 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1007 13:07:25.391061 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1007 13:07:25.391071 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1007 13:07:25.391088 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1007 13:07:25.394664 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b92962cb41f37a615f473651c01e37f5d53e01f3fb4b7c0eb2092095bb55239\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d927fbc5242eb951e8c2ec6d2859315f34e4b190bb43e646044111d1f583bf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d927fbc5242eb951e8c2ec6d2859315f34e4b190bb43e646044111d1f583bf0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:05Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:05 crc kubenswrapper[4677]: I1007 13:08:05.861920 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:05Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:05 crc kubenswrapper[4677]: I1007 13:08:05.882525 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e49f4b2f98de9e297e6a31a5583120192adf9a013700b49bb419e54d9e75fdbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:05Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:05 crc kubenswrapper[4677]: I1007 13:08:05.905551 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-r7cnz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7879fa59-a7cb-4d29-ba3a-c91f43bfcba6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5b2cfaaf4533573a7cdf927cb9a0b61690f4f04ca22f5da5013fd218ee2cba1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r59hd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d3e4ef8267212ad1faf24bfcb3b6f633a283684ba587e304e94d434bd9a2618\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r59hd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-r7cnz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:05Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:05 crc kubenswrapper[4677]: I1007 13:08:05.937450 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:05 crc kubenswrapper[4677]: I1007 13:08:05.937492 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:05 crc kubenswrapper[4677]: I1007 13:08:05.937504 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:05 crc kubenswrapper[4677]: I1007 13:08:05.937521 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:05 crc kubenswrapper[4677]: I1007 13:08:05.937534 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:05Z","lastTransitionTime":"2025-10-07T13:08:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:05 crc kubenswrapper[4677]: I1007 13:08:05.944224 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29c8j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3458826a-000d-407d-92c8-236d1a05842e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cf7d8cdd34bc883eae38c5e4690efd4e1c29cc633b5bbadc5de2b5b844a9da3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b77b2aafb3baf1c5b72d62156bd1c1bec76385637d5795166fe3d4f22a169503\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99b5fbb5ad3249aa5264c37bd635ed5f6283ec72c7eb071002cd7bddc12052f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eee7c253a1a514447553be977a3e534608ef6a1178664bf139ee84ec41180db0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f333db7aeb7d3cd308131992b4cd1284c1c56e27bbfd731404febc0efc953925\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ddf4e352b778815786f6fb204486a53d958310e53569f89a2895fe388a727da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d41f7a2621bf9c7b0e852cc4fb11dc29a2856c5b
6519628df5c77b064c310b1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fba8b3547cb6e55cb13bf4889eabf3bf5e731d44475c488c659fe844b985074\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T13:07:41Z\\\",\\\"message\\\":\\\"mage-registry]} name:Service_openshift-image-registry/image-registry_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.93:5000:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {83c1e277-3d22-42ae-a355-f7a0ff0bd171}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1007 13:07:41.452562 6109 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-ingress-canary/ingress-canary]} name:Service_openshift-ingress-canary/ingress-canary_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.34:8443: 10.217.5.34:8888:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7715118b-bb1b-400a-803e-7ab2cc3eeec0}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1007 13:07:41.452766 6109 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-console/console\\\\\\\"}\\\\nF1007 13:07:41.452679 6109 
ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:40Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1f410624ff7e026c196d43d5ef830ce7b34981b703d5399a135dab0122640ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":
[{\\\"containerID\\\":\\\"cri-o://fa229d0805576aa04db8e58e52e8c8bdc07663414dde42a94cac4fe628cfac7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa229d0805576aa04db8e58e52e8c8bdc07663414dde42a94cac4fe628cfac7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-29c8j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:05Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:05 crc kubenswrapper[4677]: I1007 13:08:05.969787 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pjgpx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"73bebfb3-50b5-48b6-b348-1d1feb6202d2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b0ac92c71edc3d5107aece2d0e005a546cf25d79d696f4e330b7c0c8babc546\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h59cg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pjgpx\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:05Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:05 crc kubenswrapper[4677]: I1007 13:08:05.980860 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qf2v9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea5a5436-29f6-4edd-9d4d-22eb9dd828c3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1a8e8a31adbc84ed02ff984941bb00da95740b19e8717fc6d4fb39b62338973\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ftmxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://97188126bcea5ad3844f74c9402e831926e1142944778240b4d4b26da7ea40c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ftmxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\
\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qf2v9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:05Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:05 crc kubenswrapper[4677]: I1007 13:08:05.990920 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8bljr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f63a77a6-7e4a-4ed0-a996-b8f80233d10c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rc97j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rc97j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:43Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8bljr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:05Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:06 crc kubenswrapper[4677]: I1007 13:08:06.003293 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c40c47d2-50a4-43f1-9b6e-08b60a3260c5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f36e52a7e88b59d8fd38c1fe659ce9b539e514c9d31e326a3ed647ebb8d19781\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://84b1c015461fecca9e5122abe950f33e24f4b7188568ea84cb059a08a4637963\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ecf81a2a9f147c0d9643f8e6c45248164053203ca4e5bbdc57c38e5803a5386\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025
-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e8dc3f8bdc52104efdb49a017d6497e2aaa3ed2b593794413fcd1acf2e06d36\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:06Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:06 crc kubenswrapper[4677]: I1007 13:08:06.017050 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:06Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:06 crc kubenswrapper[4677]: I1007 13:08:06.026975 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c2h2k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6a7b491-6ed9-4906-8d2d-d8913a581b95\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://079f44a0676fd6e659268707658dfce76f5c80881ebd1b7f77b831a653002cbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gh4gv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c2h2k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-10-07T13:08:06Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:06 crc kubenswrapper[4677]: I1007 13:08:06.040178 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:06 crc kubenswrapper[4677]: I1007 13:08:06.040228 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:06 crc kubenswrapper[4677]: I1007 13:08:06.040239 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:06 crc kubenswrapper[4677]: I1007 13:08:06.040255 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:06 crc kubenswrapper[4677]: I1007 13:08:06.040269 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:06Z","lastTransitionTime":"2025-10-07T13:08:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:06 crc kubenswrapper[4677]: I1007 13:08:06.044970 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-czmsr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67f7f734-b59a-447c-b4a5-5aeb78d3a4dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://620c115cda692d86e1c655fe633ade8d56b4ad3faff70ec3383e0d6931e91acd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84e2aad54ae491e74845fbb681491bde96694b5f242d15a1b4c0f62b81048e7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f
8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84e2aad54ae491e74845fbb681491bde96694b5f242d15a1b4c0f62b81048e7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2dfd4b5d1804d9d8e9ccf4dea8fb23cf60cd039e32d4bfc7d0b0e189da178b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2dfd4b5d1804d9d8e9ccf4dea8fb23cf60cd039e32d4bfc7d0b0e189da178b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c337ed2525a5d0aca41ff193138829ecd98172765110f97081d5a9b57ea39519\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c337ed2525a5d0aca41ff193138829ecd98172765110f97081d5a9b57ea39519\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/e
tc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b53d862bdcbdf9ebb58ca6717073ecf18236be981bf633bf22e3a077fa6ed90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b53d862bdcbdf9ebb58ca6717073ecf18236be981bf633bf22e3a077fa6ed90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3de85e8780fadb509355343e297ea73c2bd1047219be9911d445e044d40d378b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3de85e8780fadb509355343e297ea73c2bd1047219be9911d445e044d40d378b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6309cf358d3d85018e9e89fdb974146ab00944de09d3b38cfbf357df59b442f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6309cf358d3d85018e9e89fdb974146ab00944de09d3b38cfbf357df59b442f3\\\",\\\"exitC
ode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-czmsr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:06Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:06 crc kubenswrapper[4677]: I1007 13:08:06.057866 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c3400d7-6126-498b-ba93-b88903b8d698\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:08:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:08:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffff3685813b0115a56c61e90cb80318d0265429d9be16eeb9a4d0870ec2a442\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://10f7bfbc2ad9b8a554ee30118d74323aeddc334939ab97ab61cbd5eb24ae1db3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c50c0057805c0f200830303329e9c3c8c75c20246ace7131caff6afb6aca6f79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6cd7e95c3dad799fcc99041e53970f7f6be7e9f4280d724394e9a06051043706\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6cd7e95c3dad799fcc99041e53970f7f6be7e9f4280d724394e9a06051043706\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:10Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:09Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:06Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:06 crc kubenswrapper[4677]: I1007 13:08:06.071835 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf9347ca53bc58ad2e19bdbccd5eb40fde5ef36cdc0c2a2899e7e86977208446\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7932ee6d24ab75f34dabb17b5c2732dc1437e94b4fab6cace5c5bf4d8b4a8fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:06Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:06 crc kubenswrapper[4677]: I1007 13:08:06.142601 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:06 crc kubenswrapper[4677]: I1007 13:08:06.142667 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:06 crc kubenswrapper[4677]: I1007 13:08:06.142684 4677 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Oct 07 13:08:06 crc kubenswrapper[4677]: I1007 13:08:06.142711 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:06 crc kubenswrapper[4677]: I1007 13:08:06.142728 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:06Z","lastTransitionTime":"2025-10-07T13:08:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:06 crc kubenswrapper[4677]: I1007 13:08:06.245503 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:06 crc kubenswrapper[4677]: I1007 13:08:06.245544 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:06 crc kubenswrapper[4677]: I1007 13:08:06.245555 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:06 crc kubenswrapper[4677]: I1007 13:08:06.245569 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:06 crc kubenswrapper[4677]: I1007 13:08:06.245580 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:06Z","lastTransitionTime":"2025-10-07T13:08:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:06 crc kubenswrapper[4677]: I1007 13:08:06.302754 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 13:08:06 crc kubenswrapper[4677]: E1007 13:08:06.302944 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 13:08:06 crc kubenswrapper[4677]: I1007 13:08:06.348648 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:06 crc kubenswrapper[4677]: I1007 13:08:06.348694 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:06 crc kubenswrapper[4677]: I1007 13:08:06.348707 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:06 crc kubenswrapper[4677]: I1007 13:08:06.348730 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:06 crc kubenswrapper[4677]: I1007 13:08:06.348746 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:06Z","lastTransitionTime":"2025-10-07T13:08:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:06 crc kubenswrapper[4677]: I1007 13:08:06.452085 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:06 crc kubenswrapper[4677]: I1007 13:08:06.452158 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:06 crc kubenswrapper[4677]: I1007 13:08:06.452176 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:06 crc kubenswrapper[4677]: I1007 13:08:06.452265 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:06 crc kubenswrapper[4677]: I1007 13:08:06.452289 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:06Z","lastTransitionTime":"2025-10-07T13:08:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:08:06 crc kubenswrapper[4677]: I1007 13:08:06.555708 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:06 crc kubenswrapper[4677]: I1007 13:08:06.555788 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:06 crc kubenswrapper[4677]: I1007 13:08:06.555811 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:06 crc kubenswrapper[4677]: I1007 13:08:06.555904 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:06 crc kubenswrapper[4677]: I1007 13:08:06.555928 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:06Z","lastTransitionTime":"2025-10-07T13:08:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:06 crc kubenswrapper[4677]: I1007 13:08:06.659703 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:06 crc kubenswrapper[4677]: I1007 13:08:06.659756 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:06 crc kubenswrapper[4677]: I1007 13:08:06.659774 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:06 crc kubenswrapper[4677]: I1007 13:08:06.659797 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:06 crc kubenswrapper[4677]: I1007 13:08:06.659813 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:06Z","lastTransitionTime":"2025-10-07T13:08:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:08:06 crc kubenswrapper[4677]: I1007 13:08:06.714111 4677 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-29c8j_3458826a-000d-407d-92c8-236d1a05842e/ovnkube-controller/2.log" Oct 07 13:08:06 crc kubenswrapper[4677]: I1007 13:08:06.715154 4677 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-29c8j_3458826a-000d-407d-92c8-236d1a05842e/ovnkube-controller/1.log" Oct 07 13:08:06 crc kubenswrapper[4677]: I1007 13:08:06.718978 4677 generic.go:334] "Generic (PLEG): container finished" podID="3458826a-000d-407d-92c8-236d1a05842e" containerID="d41f7a2621bf9c7b0e852cc4fb11dc29a2856c5b6519628df5c77b064c310b1b" exitCode=1 Oct 07 13:08:06 crc kubenswrapper[4677]: I1007 13:08:06.719042 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29c8j" event={"ID":"3458826a-000d-407d-92c8-236d1a05842e","Type":"ContainerDied","Data":"d41f7a2621bf9c7b0e852cc4fb11dc29a2856c5b6519628df5c77b064c310b1b"} Oct 07 13:08:06 crc kubenswrapper[4677]: I1007 13:08:06.719115 4677 scope.go:117] "RemoveContainer" containerID="8fba8b3547cb6e55cb13bf4889eabf3bf5e731d44475c488c659fe844b985074" Oct 07 13:08:06 crc kubenswrapper[4677]: I1007 13:08:06.720482 4677 scope.go:117] "RemoveContainer" containerID="d41f7a2621bf9c7b0e852cc4fb11dc29a2856c5b6519628df5c77b064c310b1b" Oct 07 13:08:06 crc kubenswrapper[4677]: E1007 13:08:06.720800 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-29c8j_openshift-ovn-kubernetes(3458826a-000d-407d-92c8-236d1a05842e)\"" pod="openshift-ovn-kubernetes/ovnkube-node-29c8j" podUID="3458826a-000d-407d-92c8-236d1a05842e" Oct 07 13:08:06 crc kubenswrapper[4677]: I1007 13:08:06.744894 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c40c47d2-50a4-43f1-9b6e-08b60a3260c5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f36e52a7e88b59d8fd38c1fe659ce9b539e514c9d31e326a3ed647ebb8d19781\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://84b1c015461fecca9e5122abe950f33e24f4b7188568ea84cb059a08a4637963\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ecf81a2a9f147c0d9643f8e6c45248164053203ca4e5bbdc57c38e5803a5386\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e8dc3f8bdc52104efdb49a017d6497e2aaa3ed2b593794413fcd1acf2e06d36\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:06Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:06 crc kubenswrapper[4677]: I1007 13:08:06.763952 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:06 crc kubenswrapper[4677]: I1007 13:08:06.763993 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:06 crc kubenswrapper[4677]: I1007 13:08:06.764001 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:06 crc kubenswrapper[4677]: I1007 13:08:06.764015 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:06 crc kubenswrapper[4677]: I1007 13:08:06.764025 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:06Z","lastTransitionTime":"2025-10-07T13:08:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:08:06 crc kubenswrapper[4677]: I1007 13:08:06.767273 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:06Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:06 crc kubenswrapper[4677]: I1007 13:08:06.789264 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-r7cnz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7879fa59-a7cb-4d29-ba3a-c91f43bfcba6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5b2cfaaf4533573a7cdf927cb9a0b61690f4f04ca22f5da5013fd218ee2cba1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r59hd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d3e4ef8267212ad1faf24bfcb3b6f633a283684ba587e304e94d434bd9a2618\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r59hd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-r7cnz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:06Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:06 crc kubenswrapper[4677]: I1007 13:08:06.822872 4677 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29c8j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3458826a-000d-407d-92c8-236d1a05842e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cf7d8cdd34bc883eae38c5e4690efd4e1c29cc633b5bbadc5de2b5b844a9da3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b77b2aafb3baf1c5b72d62156bd1c1bec76385637d5795166fe3d4f22a169503\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99b5fbb5ad3249aa5264c37bd635ed5f6283ec72c7eb071002cd7bddc12052f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eee7c253a1a514447553be977a3e534608ef6a1178664bf139ee84ec41180db0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f333db7aeb7d3cd308131992b4cd1284c1c56e27bbfd731404febc0efc953925\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ddf4e352b778815786f6fb204486a53d958310e53569f89a2895fe388a727da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\
\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d41f7a2621bf9c7b0e852cc4fb11dc29a2856c5b6519628df5c77b064c310b1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fba8b3547cb6e55cb13bf4889eabf3bf5e731d44475c488c659fe844b985074\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T13:07:41Z\\\",\\\"message\\\":\\\"mage-registry]} name:Service_openshift-image-registry/image-registry_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.93:5000:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {83c1e277-3d22-42ae-a355-f7a0ff0bd171}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1007 13:07:41.452562 6109 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-ingress-canary/ingress-canary]} name:Service_openshift-ingress-canary/ingress-canary_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.34:8443: 10.217.5.34:8888:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7715118b-bb1b-400a-803e-7ab2cc3eeec0}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1007 13:07:41.452766 6109 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-console/console\\\\\\\"}\\\\nF1007 13:07:41.452679 6109 
ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:40Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d41f7a2621bf9c7b0e852cc4fb11dc29a2856c5b6519628df5c77b064c310b1b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T13:08:06Z\\\",\\\"message\\\":\\\"release.openshift.io/self-managed-high-availability:true include.release.openshift.io/single-node-developer:true service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-secret-name:serving-cert service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [{config.openshift.io/v1 ClusterVersion version 9101b518-476b-4eea-8fa6-69b0534e5caa 0xc00767ca9f \\\\u003cnil\\\\u003e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:https,Protocol:TCP,Port:443,TargetPort:{0 8443 },NodePort:0,AppProtocol:nil,},},Selector:map[string]string{app: service-ca-operator,},ClusterIP:10.217.4.40,Type:ClusterIP,ExternalIPs:[],SessionAffinity:None,LoadBalancerIP:,LoadBalancerSourceRanges:[],ExternalName:,ExternalTrafficPolicy:,HealthCheckNodePort:0,PublishNotReadyAddresses:false,SessionAffinityConfig:nil,IPFamilyPolicy:*SingleStack,ClusterIPs:[10.217.4.40],IPFamilies:[IPv4],AllocateLoadBalancerNodePorts:nil,LoadBalancerClass:nil,InternalTrafficPolicy:*Cluster,TrafficDistribution:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ingress:[]LoadBalancerIngress{},},Conditions:[]Condition{},},}\\\\nF1007 13:08:06.342882 6400 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1f410624ff7e026c196d43d5ef830ce7b34981b703d5399a135dab0122640ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa229d0805576aa04db8e58e52e8c8bdc07663414dde42a94cac4fe628cfac7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d
1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa229d0805576aa04db8e58e52e8c8bdc07663414dde42a94cac4fe628cfac7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-29c8j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:06Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:06 crc kubenswrapper[4677]: I1007 13:08:06.845288 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pjgpx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"73bebfb3-50b5-48b6-b348-1d1feb6202d2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b0ac92c71edc3d5107aece2d0e005a546cf25d79d696f4e330b7c0c8babc546\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/o
pt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h59cg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pjgpx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:06Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:06 crc kubenswrapper[4677]: I1007 13:08:06.861872 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qf2v9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea5a5436-29f6-4edd-9d4d-22eb9dd828c3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1a8e8a31adbc84ed02ff984941bb00da95740b19e8717fc6d4fb39b62338973\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ftmxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://97188126bcea5ad3844f74c9402e831926e1142944778240b4d4b26da7ea40c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ftmxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qf2v9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:06Z is after 2025-08-24T17:21:41Z" Oct 07 
13:08:06 crc kubenswrapper[4677]: I1007 13:08:06.867634 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:06 crc kubenswrapper[4677]: I1007 13:08:06.867678 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:06 crc kubenswrapper[4677]: I1007 13:08:06.867695 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:06 crc kubenswrapper[4677]: I1007 13:08:06.867719 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:06 crc kubenswrapper[4677]: I1007 13:08:06.867736 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:06Z","lastTransitionTime":"2025-10-07T13:08:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:06 crc kubenswrapper[4677]: I1007 13:08:06.880609 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8bljr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f63a77a6-7e4a-4ed0-a996-b8f80233d10c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rc97j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rc97j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:43Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8bljr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:06Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:06 crc kubenswrapper[4677]: I1007 13:08:06.899683 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c3400d7-6126-498b-ba93-b88903b8d698\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:08:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:08:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffff3685813b0115a56c61e90cb80318d0265429d9be16eeb9a4d0870ec2a442\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://10f7bfbc2ad9b8a554ee30118d74323aeddc334939ab97ab61cbd5eb24ae1db3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c50c0057805c0f200830303329e9c3c8c75c20246ace7131caff6afb6aca6f79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6cd7e95c3dad799fcc99041e53970f7f6be7e9f4280d724394e9a06051043706\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6cd7e95c3dad799fcc99041e53970f7f6be7e9f4280d724394e9a06051043706\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:10Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:09Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:06Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:06 crc kubenswrapper[4677]: I1007 13:08:06.919329 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf9347ca53bc58ad2e19bdbccd5eb40fde5ef36cdc0c2a2899e7e86977208446\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7932ee6d24ab75f34dabb17b5c2732dc1437e94b4fab6cace5c5bf4d8b4a8fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:06Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:06 crc kubenswrapper[4677]: I1007 13:08:06.937102 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c2h2k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6a7b491-6ed9-4906-8d2d-d8913a581b95\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://079f44a0676fd6e659268707658dfce76f5c80881ebd1b7f77b831a653002cbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gh4gv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c2h2k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired 
or is not yet valid: current time 2025-10-07T13:08:06Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:06 crc kubenswrapper[4677]: I1007 13:08:06.961497 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-czmsr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67f7f734-b59a-447c-b4a5-5aeb78d3a4dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://620c115cda692d86e1c655fe633ade8d56b4ad3faff70ec3383e0d6931e91acd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84e2aad54ae491e74845fbb681491bde96694b5f242d15a1b4c0f62b81048e7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84e2aad54ae491e74845fbb681491bde96694b5f242d15a1b4c0f62b81048e7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2dfd4b5d1804d9d
8e9ccf4dea8fb23cf60cd039e32d4bfc7d0b0e189da178b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2dfd4b5d1804d9d8e9ccf4dea8fb23cf60cd039e32d4bfc7d0b0e189da178b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c337ed2525a5d0aca41ff193138829ecd98172765110f97081d5a9b57ea39519\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c337ed2525a5d0aca41ff193138829ecd98172765110f97081d5a9b57ea39519\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b53d862bdcbdf9ebb58ca6717073ecf18236be981bf633bf22e3a077fa6ed90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b53d862bdcbdf9ebb58ca6717073ecf18236be981bf633bf22e3a077fa6ed90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoin
t\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3de85e8780fadb509355343e297ea73c2bd1047219be9911d445e044d40d378b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3de85e8780fadb509355343e297ea73c2bd1047219be9911d445e044d40d378b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6309cf358d3d85018e9e89fdb974146ab00944de09d3b38cfbf357df59b442f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6309cf358d3d85018e9e89fdb974146ab00944de09d3b38cfbf357df59b442f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-czmsr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:06Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:06 crc kubenswrapper[4677]: I1007 13:08:06.970953 4677 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:06 crc kubenswrapper[4677]: I1007 13:08:06.971035 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:06 crc kubenswrapper[4677]: I1007 13:08:06.971065 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:06 crc kubenswrapper[4677]: I1007 13:08:06.971100 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:06 crc kubenswrapper[4677]: I1007 13:08:06.971125 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:06Z","lastTransitionTime":"2025-10-07T13:08:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:07 crc kubenswrapper[4677]: I1007 13:08:07.004329 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9c35782-52f8-4fbc-9e52-07ee92002e3d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9182677b05c8d32f333b4e806b6dc29e0ce3f6171616ed303459ccb6a3754a4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://550a4491cebcbd8b3a62831cce07b13bb79051cd51505aab1f74bcfee692f7b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\
\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d90d7cc786a9f94c269a99be97c00685a2e10bde12e0afe4db2de40b95749a47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e99541a9f53339e760fb1074be18ebfcb8b225c64c290478559d2e3722ba9296\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73cec6b690e4d01d9d206a812f278832d622a7bdfb74ddcfb5904e19f721fae6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f16639187300b01b7c9f30dda26da7c1de9de9c66baf7ad716875eba41a5d7d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\
"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f16639187300b01b7c9f30dda26da7c1de9de9c66baf7ad716875eba41a5d7d6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dab4e59081dcacddedf345841dce8c940fae52aeee55d6c956e1364dd70872b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dab4e59081dcacddedf345841dce8c940fae52aeee55d6c956e1364dd70872b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://abcfe5aa83c73ae24b7f0b414b47344969924ca50a5e1669bfb4704f1386cd17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abcfe5aa83c73ae24b7f0b414b47344969924ca50a5e1669bfb4704f1386cd17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:07Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:07 crc kubenswrapper[4677]: I1007 13:08:07.027804 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ad65790-6a90-4c21-b5c5-ac1ddf2cbe52\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34b15b8f71a10920a74f784c3440031e14726f661e10f628b269da08e70a7cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a51c6d4e82136b444754dc679f864558f74624af4ff94f794e473c92c8f6c87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96d7ddc445a61d5fd4959d1b3b4e2c93503111a12f461d945dd298a3f8540f65\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13a4e9dcaf4f4585c45625444ba093f84acc83f03e96235efd054bed4a38fc21\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f3139524d94a487c756b4a7e3660c0a4380bcba8aa8b588368530f43aab0b1d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T13:07:25Z\\\",\\\"message\\\":\\\"le observer\\\\nW1007 13:07:25.065953 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1007 13:07:25.066115 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 13:07:25.067558 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2107846005/tls.crt::/tmp/serving-cert-2107846005/tls.key\\\\\\\"\\\\nI1007 13:07:25.378394 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1007 13:07:25.383525 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1007 13:07:25.383565 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1007 13:07:25.383605 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1007 13:07:25.383617 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1007 13:07:25.390977 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1007 13:07:25.391014 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1007 13:07:25.391020 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 13:07:25.391038 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 13:07:25.391053 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1007 13:07:25.391061 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1007 13:07:25.391071 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1007 13:07:25.391088 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1007 13:07:25.394664 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b92962cb41f37a615f473651c01e37f5d53e01f3fb4b7c0eb2092095bb55239\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d927fbc5242eb951e8c2ec6d2859315f34e4b190bb43e646044111d1f583bf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d927fbc5242eb951e8c2ec6d2859315f34e4b190bb43e646044111d1f583bf0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:07Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:07 crc kubenswrapper[4677]: I1007 13:08:07.048754 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ef29067dc23263a94c4f861ada9ebbe04aae442de3da9fa34db521177f60ce8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:07Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:07 crc kubenswrapper[4677]: I1007 13:08:07.064602 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:07Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:07 crc kubenswrapper[4677]: I1007 13:08:07.073815 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:07 crc kubenswrapper[4677]: I1007 13:08:07.073846 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:07 crc kubenswrapper[4677]: I1007 13:08:07.073854 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:07 crc kubenswrapper[4677]: I1007 13:08:07.073895 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:07 crc kubenswrapper[4677]: I1007 13:08:07.073905 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:07Z","lastTransitionTime":"2025-10-07T13:08:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:08:07 crc kubenswrapper[4677]: I1007 13:08:07.075869 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8xd94" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f78c6e9a-e5e3-4296-b2e6-3ba36d1808ff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2e7ebbc9f01ac7f853075c65c8cc57c691cf3f95e41036294486ad4a3bb807c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8xd94\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:07Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:07 crc kubenswrapper[4677]: I1007 13:08:07.091865 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:07Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:07 crc kubenswrapper[4677]: I1007 13:08:07.109770 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e49f4b2f98de9e297e6a31a5583120192adf9a013700b49bb419e54d9e75fdbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:07Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:07 crc kubenswrapper[4677]: I1007 13:08:07.177948 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:07 crc kubenswrapper[4677]: I1007 13:08:07.178007 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:07 crc kubenswrapper[4677]: I1007 13:08:07.178024 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:07 crc kubenswrapper[4677]: I1007 13:08:07.178048 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:07 crc kubenswrapper[4677]: I1007 13:08:07.178066 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:07Z","lastTransitionTime":"2025-10-07T13:08:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:07 crc kubenswrapper[4677]: I1007 13:08:07.282109 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:07 crc kubenswrapper[4677]: I1007 13:08:07.282173 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:07 crc kubenswrapper[4677]: I1007 13:08:07.282197 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:07 crc kubenswrapper[4677]: I1007 13:08:07.282227 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:07 crc kubenswrapper[4677]: I1007 13:08:07.282249 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:07Z","lastTransitionTime":"2025-10-07T13:08:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:07 crc kubenswrapper[4677]: I1007 13:08:07.302127 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 13:08:07 crc kubenswrapper[4677]: E1007 13:08:07.302287 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 13:08:07 crc kubenswrapper[4677]: I1007 13:08:07.302654 4677 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-8bljr" Oct 07 13:08:07 crc kubenswrapper[4677]: E1007 13:08:07.302820 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8bljr" podUID="f63a77a6-7e4a-4ed0-a996-b8f80233d10c" Oct 07 13:08:07 crc kubenswrapper[4677]: I1007 13:08:07.302660 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:08:07 crc kubenswrapper[4677]: E1007 13:08:07.303003 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 13:08:07 crc kubenswrapper[4677]: I1007 13:08:07.385885 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:07 crc kubenswrapper[4677]: I1007 13:08:07.385943 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:07 crc kubenswrapper[4677]: I1007 13:08:07.385961 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:07 crc kubenswrapper[4677]: I1007 13:08:07.385984 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:07 crc kubenswrapper[4677]: I1007 13:08:07.386001 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:07Z","lastTransitionTime":"2025-10-07T13:08:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:08:07 crc kubenswrapper[4677]: I1007 13:08:07.489357 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:07 crc kubenswrapper[4677]: I1007 13:08:07.489416 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:07 crc kubenswrapper[4677]: I1007 13:08:07.489472 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:07 crc kubenswrapper[4677]: I1007 13:08:07.489537 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:07 crc kubenswrapper[4677]: I1007 13:08:07.489557 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:07Z","lastTransitionTime":"2025-10-07T13:08:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:07 crc kubenswrapper[4677]: I1007 13:08:07.593033 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:07 crc kubenswrapper[4677]: I1007 13:08:07.593086 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:07 crc kubenswrapper[4677]: I1007 13:08:07.593104 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:07 crc kubenswrapper[4677]: I1007 13:08:07.593127 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:07 crc kubenswrapper[4677]: I1007 13:08:07.593143 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:07Z","lastTransitionTime":"2025-10-07T13:08:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:07 crc kubenswrapper[4677]: I1007 13:08:07.700900 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:07 crc kubenswrapper[4677]: I1007 13:08:07.700981 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:07 crc kubenswrapper[4677]: I1007 13:08:07.701003 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:07 crc kubenswrapper[4677]: I1007 13:08:07.701033 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:07 crc kubenswrapper[4677]: I1007 13:08:07.701056 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:07Z","lastTransitionTime":"2025-10-07T13:08:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:08:07 crc kubenswrapper[4677]: I1007 13:08:07.725494 4677 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-29c8j_3458826a-000d-407d-92c8-236d1a05842e/ovnkube-controller/2.log" Oct 07 13:08:07 crc kubenswrapper[4677]: I1007 13:08:07.731723 4677 scope.go:117] "RemoveContainer" containerID="d41f7a2621bf9c7b0e852cc4fb11dc29a2856c5b6519628df5c77b064c310b1b" Oct 07 13:08:07 crc kubenswrapper[4677]: E1007 13:08:07.731966 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-29c8j_openshift-ovn-kubernetes(3458826a-000d-407d-92c8-236d1a05842e)\"" pod="openshift-ovn-kubernetes/ovnkube-node-29c8j" podUID="3458826a-000d-407d-92c8-236d1a05842e" Oct 07 13:08:07 crc kubenswrapper[4677]: I1007 13:08:07.749554 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-r7cnz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7879fa59-a7cb-4d29-ba3a-c91f43bfcba6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5b2cfaaf4533573a7cdf927cb9a0b61690f4f04ca22f5da5013fd218ee2cba1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r59hd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d3e4ef8267212ad1faf24bfcb3b6f633a283684ba587e304e94d434bd9a2618\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\"
:true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r59hd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-r7cnz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:07Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:07 crc kubenswrapper[4677]: I1007 13:08:07.780590 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29c8j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3458826a-000d-407d-92c8-236d1a05842e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cf7d8cdd34bc883eae38c5e4690efd4e1c29cc633b5bbadc5de2b5b844a9da3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b77b2aafb3baf1c5b72d62156bd1c1bec76385637d5795166fe3d4f22a169503\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99b5fbb5ad3249aa5264c37bd635ed5f6283ec72c7eb071002cd7bddc12052f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eee7c253a1a514447553be977a3e534608ef6a1178664bf139ee84ec41180db0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f333db7aeb7d3cd308131992b4cd1284c1c56e27bbfd731404febc0efc953925\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ddf4e352b778815786f6fb204486a53d958310e53569f89a2895fe388a727da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d41f7a2621bf9c7b0e852cc4fb11dc29a2856c5b
6519628df5c77b064c310b1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d41f7a2621bf9c7b0e852cc4fb11dc29a2856c5b6519628df5c77b064c310b1b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T13:08:06Z\\\",\\\"message\\\":\\\"release.openshift.io/self-managed-high-availability:true include.release.openshift.io/single-node-developer:true service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-secret-name:serving-cert service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [{config.openshift.io/v1 ClusterVersion version 9101b518-476b-4eea-8fa6-69b0534e5caa 0xc00767ca9f \\\\u003cnil\\\\u003e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:https,Protocol:TCP,Port:443,TargetPort:{0 8443 },NodePort:0,AppProtocol:nil,},},Selector:map[string]string{app: service-ca-operator,},ClusterIP:10.217.4.40,Type:ClusterIP,ExternalIPs:[],SessionAffinity:None,LoadBalancerIP:,LoadBalancerSourceRanges:[],ExternalName:,ExternalTrafficPolicy:,HealthCheckNodePort:0,PublishNotReadyAddresses:false,SessionAffinityConfig:nil,IPFamilyPolicy:*SingleStack,ClusterIPs:[10.217.4.40],IPFamilies:[IPv4],AllocateLoadBalancerNodePorts:nil,LoadBalancerClass:nil,InternalTrafficPolicy:*Cluster,TrafficDistribution:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ingress:[]LoadBalancerIngress{},},Conditions:[]Condition{},},}\\\\nF1007 13:08:06.342882 6400 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:08:05Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-29c8j_openshift-ovn-kubernetes(3458826a-000d-407d-92c8-236d1a05842e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1f410624ff7e026c196d43d5ef830ce7b34981b703d5399a135dab0122640ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa229d0805576aa04db8e58e52e8c8bdc07663414dde42a94cac4fe628cfac7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa229d0805576aa04db8e58e52e8c8bdc07663414dde42a94cac4fe628cfac7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-29c8j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:07Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:07 crc kubenswrapper[4677]: I1007 13:08:07.802118 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pjgpx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"73bebfb3-50b5-48b6-b348-1d1feb6202d2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b0ac92c71edc3d5107aece2d0e005a546cf25d79d696f4e330b7c0c8babc546\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-
cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h59cg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pjgpx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:07Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:07 crc kubenswrapper[4677]: I1007 13:08:07.804058 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:07 crc kubenswrapper[4677]: I1007 13:08:07.804124 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:07 crc kubenswrapper[4677]: I1007 13:08:07.804144 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:07 crc kubenswrapper[4677]: I1007 13:08:07.804168 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:07 crc kubenswrapper[4677]: I1007 13:08:07.804186 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:07Z","lastTransitionTime":"2025-10-07T13:08:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:08:07 crc kubenswrapper[4677]: I1007 13:08:07.821973 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qf2v9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea5a5436-29f6-4edd-9d4d-22eb9dd828c3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1a8e8a31adbc84ed02ff984941bb00da95740b19e8717fc6d4fb39b62338973\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ftmxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://97188126bcea5ad3844f74c9402e831926e1142944778240b4d4b26da7ea40c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ftmxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qf2v9\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:07Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:07 crc kubenswrapper[4677]: I1007 13:08:07.839066 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8bljr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f63a77a6-7e4a-4ed0-a996-b8f80233d10c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rc97j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rc97j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:43Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8bljr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:07Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:07 crc 
kubenswrapper[4677]: I1007 13:08:07.859097 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c40c47d2-50a4-43f1-9b6e-08b60a3260c5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f36e52a7e88b59d8fd38c1fe659ce9b539e514c9d31e326a3ed647ebb8d19781\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://84b1c015461fecca9e5122abe950f33e24f4b7188568ea84cb059a08a4637963\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ecf81a2a9f147c0d9643f8e6c45248164053203ca4e5bbdc57c38e5803a5386\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\
":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e8dc3f8bdc52104efdb49a017d6497e2aaa3ed2b593794413fcd1acf2e06d36\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:07Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:07 crc kubenswrapper[4677]: I1007 13:08:07.881507 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:07Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:07 crc kubenswrapper[4677]: I1007 13:08:07.897942 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c2h2k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6a7b491-6ed9-4906-8d2d-d8913a581b95\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://079f44a0676fd6e659268707658dfce76f5c80881ebd1b7f77b831a653002cbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gh4gv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c2h2k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-10-07T13:08:07Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:07 crc kubenswrapper[4677]: I1007 13:08:07.907220 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:07 crc kubenswrapper[4677]: I1007 13:08:07.907269 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:07 crc kubenswrapper[4677]: I1007 13:08:07.907286 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:07 crc kubenswrapper[4677]: I1007 13:08:07.907305 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:07 crc kubenswrapper[4677]: I1007 13:08:07.907319 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:07Z","lastTransitionTime":"2025-10-07T13:08:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:07 crc kubenswrapper[4677]: I1007 13:08:07.923885 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-czmsr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67f7f734-b59a-447c-b4a5-5aeb78d3a4dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://620c115cda692d86e1c655fe633ade8d56b4ad3faff70ec3383e0d6931e91acd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84e2aad54ae491e74845fbb681491bde96694b5f242d15a1b4c0f62b81048e7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f
8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84e2aad54ae491e74845fbb681491bde96694b5f242d15a1b4c0f62b81048e7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2dfd4b5d1804d9d8e9ccf4dea8fb23cf60cd039e32d4bfc7d0b0e189da178b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2dfd4b5d1804d9d8e9ccf4dea8fb23cf60cd039e32d4bfc7d0b0e189da178b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c337ed2525a5d0aca41ff193138829ecd98172765110f97081d5a9b57ea39519\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c337ed2525a5d0aca41ff193138829ecd98172765110f97081d5a9b57ea39519\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/e
tc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b53d862bdcbdf9ebb58ca6717073ecf18236be981bf633bf22e3a077fa6ed90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b53d862bdcbdf9ebb58ca6717073ecf18236be981bf633bf22e3a077fa6ed90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3de85e8780fadb509355343e297ea73c2bd1047219be9911d445e044d40d378b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3de85e8780fadb509355343e297ea73c2bd1047219be9911d445e044d40d378b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6309cf358d3d85018e9e89fdb974146ab00944de09d3b38cfbf357df59b442f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6309cf358d3d85018e9e89fdb974146ab00944de09d3b38cfbf357df59b442f3\\\",\\\"exitC
ode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-czmsr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:07Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:07 crc kubenswrapper[4677]: I1007 13:08:07.941401 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c3400d7-6126-498b-ba93-b88903b8d698\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:08:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:08:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffff3685813b0115a56c61e90cb80318d0265429d9be16eeb9a4d0870ec2a442\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://10f7bfbc2ad9b8a554ee30118d74323aeddc334939ab97ab61cbd5eb24ae1db3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c50c0057805c0f200830303329e9c3c8c75c20246ace7131caff6afb6aca6f79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6cd7e95c3dad799fcc99041e53970f7f6be7e9f4280d724394e9a06051043706\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6cd7e95c3dad799fcc99041e53970f7f6be7e9f4280d724394e9a06051043706\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:10Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:09Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:07Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:07 crc kubenswrapper[4677]: I1007 13:08:07.957620 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf9347ca53bc58ad2e19bdbccd5eb40fde5ef36cdc0c2a2899e7e86977208446\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7932ee6d24ab75f34dabb17b5c2732dc1437e94b4fab6cace5c5bf4d8b4a8fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:07Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:07 crc kubenswrapper[4677]: I1007 13:08:07.974340 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ef29067dc23263a94c4f861ada9ebbe04aae442de3da9fa34db521177f60ce8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:07Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:07 crc kubenswrapper[4677]: I1007 13:08:07.988588 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:07Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:08 crc kubenswrapper[4677]: I1007 13:08:08.002010 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8xd94" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f78c6e9a-e5e3-4296-b2e6-3ba36d1808ff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2e7ebbc9f01ac7f853075c65c8cc57c691cf3f95e41036294486ad4a3bb807c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8xd94\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:07Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:08 crc kubenswrapper[4677]: I1007 13:08:08.012363 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:08 crc kubenswrapper[4677]: I1007 13:08:08.012403 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:08 crc kubenswrapper[4677]: I1007 13:08:08.012422 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:08 crc kubenswrapper[4677]: I1007 13:08:08.012458 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:08 crc kubenswrapper[4677]: I1007 13:08:08.012473 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:08Z","lastTransitionTime":"2025-10-07T13:08:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:08 crc kubenswrapper[4677]: I1007 13:08:08.031588 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9c35782-52f8-4fbc-9e52-07ee92002e3d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9182677b05c8d32f333b4e806b6dc29e0ce3f6171616ed303459ccb6a3754a4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://550a4491cebcbd8b3a62831cce07b13bb79051c
d51505aab1f74bcfee692f7b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d90d7cc786a9f94c269a99be97c00685a2e10bde12e0afe4db2de40b95749a47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e99541a9f53339e760fb1074be18ebfcb8b225c64c290478559d2e3722ba9296\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73cec6b690e4d01d9d206a812f278832d622a7bdfb74ddcfb5904e19f721fae6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f16639187300b01b7c9f30dda26da7c1de9de9c66baf7ad716875eba41a5d7d6\\\",\\\"image\\\":\\\"quay.io/openshi
ft-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f16639187300b01b7c9f30dda26da7c1de9de9c66baf7ad716875eba41a5d7d6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dab4e59081dcacddedf345841dce8c940fae52aeee55d6c956e1364dd70872b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dab4e59081dcacddedf345841dce8c940fae52aeee55d6c956e1364dd70872b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://abcfe5aa83c73ae24b7f0b414b47344969924ca50a5e1669bfb4704f1386cd17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abcfe5aa83c73ae24b7f0b414b47344969924ca50a5e1669bfb4704f1386cd17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:08Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:08 crc kubenswrapper[4677]: I1007 13:08:08.055952 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ad65790-6a90-4c21-b5c5-ac1ddf2cbe52\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34b15b8f71a10920a74f784c3440031e14726f661e10f628b269da08e70a7cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a51c6d4e82136b444754dc679f864558f74624af4ff94f794e473c92c8f6c87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96d7ddc445a61d5fd4959d1b3b4e2c93503111a12f461d945dd298a3f8540f65\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13a4e9dcaf4f4585c45625444ba093f84acc83f03e96235efd054bed4a38fc21\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f3139524d94a487c756b4a7e3660c0a4380bcba8aa8b588368530f43aab0b1d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T13:07:25Z\\\",\\\"message\\\":\\\"le observer\\\\nW1007 13:07:25.065953 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1007 13:07:25.066115 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 13:07:25.067558 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2107846005/tls.crt::/tmp/serving-cert-2107846005/tls.key\\\\\\\"\\\\nI1007 13:07:25.378394 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1007 13:07:25.383525 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1007 13:07:25.383565 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1007 13:07:25.383605 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1007 13:07:25.383617 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1007 13:07:25.390977 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1007 13:07:25.391014 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1007 13:07:25.391020 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 13:07:25.391038 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 13:07:25.391053 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1007 13:07:25.391061 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1007 13:07:25.391071 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1007 13:07:25.391088 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1007 13:07:25.394664 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b92962cb41f37a615f473651c01e37f5d53e01f3fb4b7c0eb2092095bb55239\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d927fbc5242eb951e8c2ec6d2859315f34e4b190bb43e646044111d1f583bf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d927fbc5242eb951e8c2ec6d2859315f34e4b190bb43e646044111d1f583bf0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:08Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:08 crc kubenswrapper[4677]: I1007 13:08:08.072971 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:08Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:08 crc kubenswrapper[4677]: I1007 13:08:08.088625 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e49f4b2f98de9e297e6a31a5583120192adf9a013700b49bb419e54d9e75fdbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:08Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:08 crc kubenswrapper[4677]: I1007 13:08:08.115932 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:08 crc kubenswrapper[4677]: I1007 13:08:08.115989 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:08 crc kubenswrapper[4677]: I1007 13:08:08.116006 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:08 crc kubenswrapper[4677]: I1007 13:08:08.116031 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:08 crc kubenswrapper[4677]: I1007 13:08:08.116048 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:08Z","lastTransitionTime":"2025-10-07T13:08:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:08 crc kubenswrapper[4677]: I1007 13:08:08.219661 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:08 crc kubenswrapper[4677]: I1007 13:08:08.219722 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:08 crc kubenswrapper[4677]: I1007 13:08:08.219740 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:08 crc kubenswrapper[4677]: I1007 13:08:08.219764 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:08 crc kubenswrapper[4677]: I1007 13:08:08.219784 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:08Z","lastTransitionTime":"2025-10-07T13:08:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:08 crc kubenswrapper[4677]: I1007 13:08:08.302730 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 13:08:08 crc kubenswrapper[4677]: E1007 13:08:08.302915 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 13:08:08 crc kubenswrapper[4677]: I1007 13:08:08.323643 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:08 crc kubenswrapper[4677]: I1007 13:08:08.323713 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:08 crc kubenswrapper[4677]: I1007 13:08:08.323799 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:08 crc kubenswrapper[4677]: I1007 13:08:08.323832 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:08 crc kubenswrapper[4677]: I1007 13:08:08.323853 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:08Z","lastTransitionTime":"2025-10-07T13:08:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:08 crc kubenswrapper[4677]: I1007 13:08:08.426831 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:08 crc kubenswrapper[4677]: I1007 13:08:08.426914 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:08 crc kubenswrapper[4677]: I1007 13:08:08.426933 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:08 crc kubenswrapper[4677]: I1007 13:08:08.427482 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:08 crc kubenswrapper[4677]: I1007 13:08:08.427535 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:08Z","lastTransitionTime":"2025-10-07T13:08:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:08:08 crc kubenswrapper[4677]: I1007 13:08:08.530349 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:08 crc kubenswrapper[4677]: I1007 13:08:08.530494 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:08 crc kubenswrapper[4677]: I1007 13:08:08.530676 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:08 crc kubenswrapper[4677]: I1007 13:08:08.530714 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:08 crc kubenswrapper[4677]: I1007 13:08:08.530732 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:08Z","lastTransitionTime":"2025-10-07T13:08:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:08 crc kubenswrapper[4677]: I1007 13:08:08.633181 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:08 crc kubenswrapper[4677]: I1007 13:08:08.633221 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:08 crc kubenswrapper[4677]: I1007 13:08:08.633232 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:08 crc kubenswrapper[4677]: I1007 13:08:08.633248 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:08 crc kubenswrapper[4677]: I1007 13:08:08.633258 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:08Z","lastTransitionTime":"2025-10-07T13:08:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:08 crc kubenswrapper[4677]: I1007 13:08:08.735359 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:08 crc kubenswrapper[4677]: I1007 13:08:08.735474 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:08 crc kubenswrapper[4677]: I1007 13:08:08.735503 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:08 crc kubenswrapper[4677]: I1007 13:08:08.735528 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:08 crc kubenswrapper[4677]: I1007 13:08:08.735546 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:08Z","lastTransitionTime":"2025-10-07T13:08:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:08:08 crc kubenswrapper[4677]: I1007 13:08:08.838290 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:08 crc kubenswrapper[4677]: I1007 13:08:08.838357 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:08 crc kubenswrapper[4677]: I1007 13:08:08.838375 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:08 crc kubenswrapper[4677]: I1007 13:08:08.838400 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:08 crc kubenswrapper[4677]: I1007 13:08:08.838416 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:08Z","lastTransitionTime":"2025-10-07T13:08:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:08 crc kubenswrapper[4677]: I1007 13:08:08.941139 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:08 crc kubenswrapper[4677]: I1007 13:08:08.941197 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:08 crc kubenswrapper[4677]: I1007 13:08:08.941220 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:08 crc kubenswrapper[4677]: I1007 13:08:08.941251 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:08 crc kubenswrapper[4677]: I1007 13:08:08.941276 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:08Z","lastTransitionTime":"2025-10-07T13:08:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:09 crc kubenswrapper[4677]: I1007 13:08:09.043840 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:09 crc kubenswrapper[4677]: I1007 13:08:09.043907 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:09 crc kubenswrapper[4677]: I1007 13:08:09.043930 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:09 crc kubenswrapper[4677]: I1007 13:08:09.043961 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:09 crc kubenswrapper[4677]: I1007 13:08:09.043981 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:09Z","lastTransitionTime":"2025-10-07T13:08:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:08:09 crc kubenswrapper[4677]: I1007 13:08:09.147648 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:09 crc kubenswrapper[4677]: I1007 13:08:09.147724 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:09 crc kubenswrapper[4677]: I1007 13:08:09.147746 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:09 crc kubenswrapper[4677]: I1007 13:08:09.147769 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:09 crc kubenswrapper[4677]: I1007 13:08:09.147786 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:09Z","lastTransitionTime":"2025-10-07T13:08:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:09 crc kubenswrapper[4677]: I1007 13:08:09.249963 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:09 crc kubenswrapper[4677]: I1007 13:08:09.250054 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:09 crc kubenswrapper[4677]: I1007 13:08:09.250072 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:09 crc kubenswrapper[4677]: I1007 13:08:09.250093 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:09 crc kubenswrapper[4677]: I1007 13:08:09.250107 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:09Z","lastTransitionTime":"2025-10-07T13:08:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:09 crc kubenswrapper[4677]: I1007 13:08:09.302085 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 13:08:09 crc kubenswrapper[4677]: I1007 13:08:09.302177 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8bljr" Oct 07 13:08:09 crc kubenswrapper[4677]: I1007 13:08:09.302185 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:08:09 crc kubenswrapper[4677]: E1007 13:08:09.302348 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 13:08:09 crc kubenswrapper[4677]: E1007 13:08:09.302573 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8bljr" podUID="f63a77a6-7e4a-4ed0-a996-b8f80233d10c" Oct 07 13:08:09 crc kubenswrapper[4677]: E1007 13:08:09.302733 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 13:08:09 crc kubenswrapper[4677]: I1007 13:08:09.324744 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:09Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:09 crc kubenswrapper[4677]: I1007 13:08:09.345356 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e49f4b2f98de9e297e6a31a5583120192adf9a013700b49bb419e54d9e75fdbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:09Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:09 crc kubenswrapper[4677]: I1007 13:08:09.358560 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:09 crc kubenswrapper[4677]: I1007 13:08:09.358621 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 07 13:08:09 crc kubenswrapper[4677]: I1007 13:08:09.358637 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:09 crc kubenswrapper[4677]: I1007 13:08:09.358655 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:09 crc kubenswrapper[4677]: I1007 13:08:09.358666 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:09Z","lastTransitionTime":"2025-10-07T13:08:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:09 crc kubenswrapper[4677]: I1007 13:08:09.379056 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29c8j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3458826a-000d-407d-92c8-236d1a05842e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cf7d8cdd34bc883eae38c5e4690efd4e1c29cc633b5bbadc5de2b5b844a9da3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b77b2aafb3baf1c5b72d62156bd1c1bec76385637d5795166fe3d4f22a169503\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99b5fbb5ad3249aa5264c37bd635ed5f6283ec72c7eb071002cd7bddc12052f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eee7c253a1a514447553be977a3e534608ef6a1178664bf139ee84ec41180db0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f333db7aeb7d3cd308131992b4cd1284c1c56e27bbfd731404febc0efc953925\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ddf4e352b778815786f6fb204486a53d958310e53569f89a2895fe388a727da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d41f7a2621bf9c7b0e852cc4fb11dc29a2856c5b
6519628df5c77b064c310b1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d41f7a2621bf9c7b0e852cc4fb11dc29a2856c5b6519628df5c77b064c310b1b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T13:08:06Z\\\",\\\"message\\\":\\\"release.openshift.io/self-managed-high-availability:true include.release.openshift.io/single-node-developer:true service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-secret-name:serving-cert service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [{config.openshift.io/v1 ClusterVersion version 9101b518-476b-4eea-8fa6-69b0534e5caa 0xc00767ca9f \\\\u003cnil\\\\u003e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:https,Protocol:TCP,Port:443,TargetPort:{0 8443 },NodePort:0,AppProtocol:nil,},},Selector:map[string]string{app: service-ca-operator,},ClusterIP:10.217.4.40,Type:ClusterIP,ExternalIPs:[],SessionAffinity:None,LoadBalancerIP:,LoadBalancerSourceRanges:[],ExternalName:,ExternalTrafficPolicy:,HealthCheckNodePort:0,PublishNotReadyAddresses:false,SessionAffinityConfig:nil,IPFamilyPolicy:*SingleStack,ClusterIPs:[10.217.4.40],IPFamilies:[IPv4],AllocateLoadBalancerNodePorts:nil,LoadBalancerClass:nil,InternalTrafficPolicy:*Cluster,TrafficDistribution:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ingress:[]LoadBalancerIngress{},},Conditions:[]Condition{},},}\\\\nF1007 13:08:06.342882 6400 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:08:05Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-29c8j_openshift-ovn-kubernetes(3458826a-000d-407d-92c8-236d1a05842e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1f410624ff7e026c196d43d5ef830ce7b34981b703d5399a135dab0122640ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa229d0805576aa04db8e58e52e8c8bdc07663414dde42a94cac4fe628cfac7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa229d0805576aa04db8e58e52e8c8bdc07663414dde42a94cac4fe628cfac7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-29c8j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:09Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:09 crc kubenswrapper[4677]: I1007 13:08:09.401793 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pjgpx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"73bebfb3-50b5-48b6-b348-1d1feb6202d2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b0ac92c71edc3d5107aece2d0e005a546cf25d79d696f4e330b7c0c8babc546\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-
cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h59cg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pjgpx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:09Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:09 crc kubenswrapper[4677]: I1007 13:08:09.418387 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qf2v9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea5a5436-29f6-4edd-9d4d-22eb9dd828c3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1a8e8a31adbc84ed02ff984941bb00da95740b19e8717fc6d4fb39b62338973\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ftmxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://97188126bcea5ad3844f74c9402e831926e1142944778240b4d4b26da7ea40c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ftmxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qf2v9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:09Z is after 2025-08-24T17:21:41Z" Oct 07 
13:08:09 crc kubenswrapper[4677]: I1007 13:08:09.431471 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8bljr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f63a77a6-7e4a-4ed0-a996-b8f80233d10c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rc97j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rc97j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:43Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8bljr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:09Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:09 crc kubenswrapper[4677]: I1007 13:08:09.448819 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c40c47d2-50a4-43f1-9b6e-08b60a3260c5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f36e52a7e88b59d8fd38c1fe659ce9b539e514c9d31e326a3ed647ebb8d19781\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://84b1c015461fecca9e5122abe950f33e24f4b7188568ea84cb059a08a4637963\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ecf81a2a9f147c0d9643f8e6c45248164053203ca4e5bbdc57c38e5803a5386\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e8dc3f8bdc52104efdb49a017d6497e2aaa3ed2b593794413fcd1acf2e06d36\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:09Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:09 crc kubenswrapper[4677]: I1007 13:08:09.461131 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:09 crc kubenswrapper[4677]: I1007 13:08:09.461178 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:09 crc kubenswrapper[4677]: I1007 13:08:09.461193 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:09 crc kubenswrapper[4677]: I1007 13:08:09.461213 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:09 crc kubenswrapper[4677]: I1007 13:08:09.461227 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:09Z","lastTransitionTime":"2025-10-07T13:08:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:08:09 crc kubenswrapper[4677]: I1007 13:08:09.462772 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:09Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:09 crc kubenswrapper[4677]: I1007 13:08:09.479095 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-r7cnz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7879fa59-a7cb-4d29-ba3a-c91f43bfcba6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5b2cfaaf4533573a7cdf927cb9a0b61690f4f04ca22f5da5013fd218ee2cba1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r59hd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d3e4ef8267212ad1faf24bfcb3b6f633a283684ba587e304e94d434bd9a2618\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r59hd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-r7cnz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:09Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:09 crc kubenswrapper[4677]: I1007 13:08:09.495765 4677 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-additional-cni-plugins-czmsr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67f7f734-b59a-447c-b4a5-5aeb78d3a4dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://620c115cda692d86e1c655fe633ade8d56b4ad3faff70ec3383e0d6931e91acd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84e2aad54ae491e74845fbb681491bde96694b5f242d15a1b4c0f62b81048e7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84e2aad54ae491e74845fbb681491bde96694b5f242d15a1b4c0f62b81048e7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2dfd4b5d1804d9d8e9ccf4dea8fb23cf60cd039e32d4bfc7d0b0e189da178b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2c
c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2dfd4b5d1804d9d8e9ccf4dea8fb23cf60cd039e32d4bfc7d0b0e189da178b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c337ed2525a5d0aca41ff193138829ecd98172765110f97081d5a9b57ea39519\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c337ed2525a5d0aca41ff193138829ecd98172765110f97081d5a9b57ea39519\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b53d862bdcbdf9ebb58ca6717073ecf18236be981bf633bf22e3a077fa6ed90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b53d862bdcbdf9ebb58ca6717073ecf18236be981bf633bf22e3a077fa6ed90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-re
lease\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3de85e8780fadb509355343e297ea73c2bd1047219be9911d445e044d40d378b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3de85e8780fadb509355343e297ea73c2bd1047219be9911d445e044d40d378b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6309cf358d3d85018e9e89fdb974146ab00944de09d3b38cfbf357df59b442f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6309cf358d3d85018e9e89fdb974146ab00944de09d3b38cfbf357df59b442f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-czmsr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:09Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:09 crc kubenswrapper[4677]: I1007 13:08:09.511133 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c3400d7-6126-498b-ba93-b88903b8d698\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:08:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:08:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffff3685813b0115a56c61e90cb80318d0265429d9be16eeb9a4d0870ec2a442\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://10f7bfbc2ad9b8a554ee30118d74323aeddc334939ab97ab61cbd5eb24ae1db3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c50c0057805c0f200830303329e9c3c8c75c20246ace7131caff6afb6aca6f79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6cd7e95c3dad799fcc99041e53970f7f6be7e9f4280d724394e9a06051043706\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6cd7e95c3dad799fcc99041e53970f7f6be7e9f4280d724394e9a06051043706\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:10Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:09Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:09Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:09 crc kubenswrapper[4677]: I1007 13:08:09.525203 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf9347ca53bc58ad2e19bdbccd5eb40fde5ef36cdc0c2a2899e7e86977208446\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7932ee6d24ab75f34dabb17b5c2732dc1437e94b4fab6cace5c5bf4d8b4a8fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:09Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:09 crc kubenswrapper[4677]: I1007 13:08:09.535872 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c2h2k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6a7b491-6ed9-4906-8d2d-d8913a581b95\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://079f44a0676fd6e659268707658dfce76f5c80881ebd1b7f77b831a653002cbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gh4gv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c2h2k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired 
or is not yet valid: current time 2025-10-07T13:08:09Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:09 crc kubenswrapper[4677]: I1007 13:08:09.563966 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ef29067dc23263a94c4f861ada9ebbe04aae442de3da9fa34db521177f60ce8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:09Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:09 crc kubenswrapper[4677]: I1007 13:08:09.564489 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:09 crc kubenswrapper[4677]: I1007 13:08:09.564645 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:09 crc kubenswrapper[4677]: I1007 13:08:09.564758 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:09 crc kubenswrapper[4677]: I1007 13:08:09.564883 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:09 crc kubenswrapper[4677]: I1007 13:08:09.565015 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:09Z","lastTransitionTime":"2025-10-07T13:08:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:09 crc kubenswrapper[4677]: I1007 13:08:09.580741 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:09Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:09 crc kubenswrapper[4677]: I1007 13:08:09.592081 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8xd94" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f78c6e9a-e5e3-4296-b2e6-3ba36d1808ff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2e7ebbc9f01ac7f853075c65c8cc57c691cf3f95e41036294486ad4a3bb807c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8xd94\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:09Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:09 crc kubenswrapper[4677]: I1007 13:08:09.613623 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9c35782-52f8-4fbc-9e52-07ee92002e3d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9182677b05c8d32f333b4e806b6dc29e0ce3f6171616ed303459ccb6a3754a4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://550a4491cebcbd8b3a62831cce07b13bb79051cd51505aab1f74bcfee692f7b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d90d7cc786a9f94c269a99be97c00685a2e10bde12e0afe4db2de40b95749a47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e99541a9f53339e760fb1074be18ebfcb8b225c
64c290478559d2e3722ba9296\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73cec6b690e4d01d9d206a812f278832d622a7bdfb74ddcfb5904e19f721fae6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f16639187300b01b7c9f30dda26da7c1de9de9c66baf7ad716875eba41a5d7d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f16639187300b01b7c9f30dda26da7c1de9de9c66baf7ad716875eba41a5d7d6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dab4e59081dcacddedf345841dce8c940fae52aeee55d6c956e1364dd70872b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dab4e59081dcacddedf345841dce8c940fae52aeee55d6c956e1364dd70872b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://abcfe5aa83c73ae24b7f0b414b47344969924ca50a5e1669bfb4704f1386cd17\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abcfe5aa83c73ae24b7f0b414b47344969924ca50a5e1669bfb4704f1386cd17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:09Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:09 crc kubenswrapper[4677]: I1007 13:08:09.626725 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ad65790-6a90-4c21-b5c5-ac1ddf2cbe52\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34b15b8f71a10920a74f784c3440031e14726f661e10f628b269da08e70a7cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a51c6d4e82136b444754dc679f86455
8f74624af4ff94f794e473c92c8f6c87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96d7ddc445a61d5fd4959d1b3b4e2c93503111a12f461d945dd298a3f8540f65\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13a4e9dcaf4f4585c45625444ba093f84acc83f03e96235efd054bed4a38fc21\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f3139524d94a487c756b4a7e3660c0a4380bcba8aa8b588368530f43aab0b1d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T13:07:25Z\\\",\\\"message\\\":\\\"le observer\\\\nW1007 13:07:25.065953 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1007 13:07:25.066115 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 13:07:25.067558 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2107846005/tls.crt::/tmp/serving-cert-2107846005/tls.key\\\\\\\"\\\\nI1007 13:07:25.378394 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1007 13:07:25.383525 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1007 13:07:25.383565 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1007 13:07:25.383605 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1007 13:07:25.383617 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1007 13:07:25.390977 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1007 13:07:25.391014 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1007 13:07:25.391020 1 
secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 13:07:25.391038 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 13:07:25.391053 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1007 13:07:25.391061 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1007 13:07:25.391071 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1007 13:07:25.391088 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1007 13:07:25.394664 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b92962cb41f37a615f473651c01e37f5d53e01f3fb4b7c0eb2092095bb55239\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d927fbc5242eb951e8c2ec6d2859315f34e4b190bb43e646044111d1f583bf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d927fbc5242eb951e8c2ec6d2859315f34e4b190bb43e646044111d1f583bf0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:09Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:09 crc kubenswrapper[4677]: I1007 13:08:09.667740 4677 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:09 crc kubenswrapper[4677]: I1007 13:08:09.667784 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:09 crc kubenswrapper[4677]: I1007 13:08:09.667798 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:09 crc kubenswrapper[4677]: I1007 13:08:09.667820 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:09 crc kubenswrapper[4677]: I1007 13:08:09.667836 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:09Z","lastTransitionTime":"2025-10-07T13:08:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:09 crc kubenswrapper[4677]: I1007 13:08:09.770630 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:09 crc kubenswrapper[4677]: I1007 13:08:09.770672 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:09 crc kubenswrapper[4677]: I1007 13:08:09.770681 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:09 crc kubenswrapper[4677]: I1007 13:08:09.770695 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:09 crc kubenswrapper[4677]: I1007 13:08:09.770704 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:09Z","lastTransitionTime":"2025-10-07T13:08:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:09 crc kubenswrapper[4677]: I1007 13:08:09.873976 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:09 crc kubenswrapper[4677]: I1007 13:08:09.874038 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:09 crc kubenswrapper[4677]: I1007 13:08:09.874061 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:09 crc kubenswrapper[4677]: I1007 13:08:09.874086 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:09 crc kubenswrapper[4677]: I1007 13:08:09.874108 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:09Z","lastTransitionTime":"2025-10-07T13:08:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:08:09 crc kubenswrapper[4677]: I1007 13:08:09.976708 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:09 crc kubenswrapper[4677]: I1007 13:08:09.976775 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:09 crc kubenswrapper[4677]: I1007 13:08:09.976800 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:09 crc kubenswrapper[4677]: I1007 13:08:09.976839 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:09 crc kubenswrapper[4677]: I1007 13:08:09.976862 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:09Z","lastTransitionTime":"2025-10-07T13:08:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:10 crc kubenswrapper[4677]: I1007 13:08:10.079274 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:10 crc kubenswrapper[4677]: I1007 13:08:10.079330 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:10 crc kubenswrapper[4677]: I1007 13:08:10.079347 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:10 crc kubenswrapper[4677]: I1007 13:08:10.079370 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:10 crc kubenswrapper[4677]: I1007 13:08:10.079388 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:10Z","lastTransitionTime":"2025-10-07T13:08:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:10 crc kubenswrapper[4677]: I1007 13:08:10.182069 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:10 crc kubenswrapper[4677]: I1007 13:08:10.182145 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:10 crc kubenswrapper[4677]: I1007 13:08:10.182163 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:10 crc kubenswrapper[4677]: I1007 13:08:10.182215 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:10 crc kubenswrapper[4677]: I1007 13:08:10.182236 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:10Z","lastTransitionTime":"2025-10-07T13:08:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:08:10 crc kubenswrapper[4677]: I1007 13:08:10.285015 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:10 crc kubenswrapper[4677]: I1007 13:08:10.285085 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:10 crc kubenswrapper[4677]: I1007 13:08:10.285107 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:10 crc kubenswrapper[4677]: I1007 13:08:10.285137 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:10 crc kubenswrapper[4677]: I1007 13:08:10.285158 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:10Z","lastTransitionTime":"2025-10-07T13:08:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:10 crc kubenswrapper[4677]: I1007 13:08:10.302374 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 13:08:10 crc kubenswrapper[4677]: E1007 13:08:10.302562 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 13:08:10 crc kubenswrapper[4677]: I1007 13:08:10.388250 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:10 crc kubenswrapper[4677]: I1007 13:08:10.388329 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:10 crc kubenswrapper[4677]: I1007 13:08:10.388348 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:10 crc kubenswrapper[4677]: I1007 13:08:10.388374 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:10 crc kubenswrapper[4677]: I1007 13:08:10.388392 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:10Z","lastTransitionTime":"2025-10-07T13:08:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:08:10 crc kubenswrapper[4677]: I1007 13:08:10.492039 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:10 crc kubenswrapper[4677]: I1007 13:08:10.492115 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:10 crc kubenswrapper[4677]: I1007 13:08:10.492136 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:10 crc kubenswrapper[4677]: I1007 13:08:10.492160 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:10 crc kubenswrapper[4677]: I1007 13:08:10.492182 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:10Z","lastTransitionTime":"2025-10-07T13:08:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:10 crc kubenswrapper[4677]: I1007 13:08:10.595714 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:10 crc kubenswrapper[4677]: I1007 13:08:10.595774 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:10 crc kubenswrapper[4677]: I1007 13:08:10.595785 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:10 crc kubenswrapper[4677]: I1007 13:08:10.595805 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:10 crc kubenswrapper[4677]: I1007 13:08:10.595816 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:10Z","lastTransitionTime":"2025-10-07T13:08:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:10 crc kubenswrapper[4677]: I1007 13:08:10.698527 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:10 crc kubenswrapper[4677]: I1007 13:08:10.698590 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:10 crc kubenswrapper[4677]: I1007 13:08:10.698610 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:10 crc kubenswrapper[4677]: I1007 13:08:10.698633 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:10 crc kubenswrapper[4677]: I1007 13:08:10.698650 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:10Z","lastTransitionTime":"2025-10-07T13:08:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:08:10 crc kubenswrapper[4677]: I1007 13:08:10.801776 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:10 crc kubenswrapper[4677]: I1007 13:08:10.801871 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:10 crc kubenswrapper[4677]: I1007 13:08:10.801895 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:10 crc kubenswrapper[4677]: I1007 13:08:10.801926 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:10 crc kubenswrapper[4677]: I1007 13:08:10.801947 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:10Z","lastTransitionTime":"2025-10-07T13:08:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:10 crc kubenswrapper[4677]: I1007 13:08:10.905534 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:10 crc kubenswrapper[4677]: I1007 13:08:10.905597 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:10 crc kubenswrapper[4677]: I1007 13:08:10.905620 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:10 crc kubenswrapper[4677]: I1007 13:08:10.905650 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:10 crc kubenswrapper[4677]: I1007 13:08:10.905673 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:10Z","lastTransitionTime":"2025-10-07T13:08:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:11 crc kubenswrapper[4677]: I1007 13:08:11.009063 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:11 crc kubenswrapper[4677]: I1007 13:08:11.009154 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:11 crc kubenswrapper[4677]: I1007 13:08:11.009180 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:11 crc kubenswrapper[4677]: I1007 13:08:11.009212 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:11 crc kubenswrapper[4677]: I1007 13:08:11.009234 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:11Z","lastTransitionTime":"2025-10-07T13:08:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:08:11 crc kubenswrapper[4677]: I1007 13:08:11.112185 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:11 crc kubenswrapper[4677]: I1007 13:08:11.112263 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:11 crc kubenswrapper[4677]: I1007 13:08:11.112285 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:11 crc kubenswrapper[4677]: I1007 13:08:11.112315 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:11 crc kubenswrapper[4677]: I1007 13:08:11.112339 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:11Z","lastTransitionTime":"2025-10-07T13:08:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:11 crc kubenswrapper[4677]: I1007 13:08:11.214807 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:11 crc kubenswrapper[4677]: I1007 13:08:11.214854 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:11 crc kubenswrapper[4677]: I1007 13:08:11.214867 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:11 crc kubenswrapper[4677]: I1007 13:08:11.214886 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:11 crc kubenswrapper[4677]: I1007 13:08:11.214897 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:11Z","lastTransitionTime":"2025-10-07T13:08:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:11 crc kubenswrapper[4677]: I1007 13:08:11.302891 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 13:08:11 crc kubenswrapper[4677]: I1007 13:08:11.302966 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:08:11 crc kubenswrapper[4677]: E1007 13:08:11.303100 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 13:08:11 crc kubenswrapper[4677]: I1007 13:08:11.303147 4677 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-8bljr" Oct 07 13:08:11 crc kubenswrapper[4677]: E1007 13:08:11.303296 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 13:08:11 crc kubenswrapper[4677]: E1007 13:08:11.303472 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8bljr" podUID="f63a77a6-7e4a-4ed0-a996-b8f80233d10c" Oct 07 13:08:11 crc kubenswrapper[4677]: I1007 13:08:11.317732 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:11 crc kubenswrapper[4677]: I1007 13:08:11.317794 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:11 crc kubenswrapper[4677]: I1007 13:08:11.317813 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:11 crc kubenswrapper[4677]: I1007 13:08:11.317838 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:11 crc kubenswrapper[4677]: I1007 13:08:11.317855 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:11Z","lastTransitionTime":"2025-10-07T13:08:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:11 crc kubenswrapper[4677]: I1007 13:08:11.420927 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:11 crc kubenswrapper[4677]: I1007 13:08:11.420986 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:11 crc kubenswrapper[4677]: I1007 13:08:11.421002 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:11 crc kubenswrapper[4677]: I1007 13:08:11.421025 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:11 crc kubenswrapper[4677]: I1007 13:08:11.421042 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:11Z","lastTransitionTime":"2025-10-07T13:08:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:08:11 crc kubenswrapper[4677]: I1007 13:08:11.525080 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:11 crc kubenswrapper[4677]: I1007 13:08:11.525145 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:11 crc kubenswrapper[4677]: I1007 13:08:11.525167 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:11 crc kubenswrapper[4677]: I1007 13:08:11.525200 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:11 crc kubenswrapper[4677]: I1007 13:08:11.525223 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:11Z","lastTransitionTime":"2025-10-07T13:08:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:11 crc kubenswrapper[4677]: I1007 13:08:11.628281 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:11 crc kubenswrapper[4677]: I1007 13:08:11.628329 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:11 crc kubenswrapper[4677]: I1007 13:08:11.628342 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:11 crc kubenswrapper[4677]: I1007 13:08:11.628359 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:11 crc kubenswrapper[4677]: I1007 13:08:11.628371 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:11Z","lastTransitionTime":"2025-10-07T13:08:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:11 crc kubenswrapper[4677]: I1007 13:08:11.731332 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:11 crc kubenswrapper[4677]: I1007 13:08:11.731402 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:11 crc kubenswrapper[4677]: I1007 13:08:11.731425 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:11 crc kubenswrapper[4677]: I1007 13:08:11.731490 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:11 crc kubenswrapper[4677]: I1007 13:08:11.731514 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:11Z","lastTransitionTime":"2025-10-07T13:08:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:08:11 crc kubenswrapper[4677]: I1007 13:08:11.834169 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:11 crc kubenswrapper[4677]: I1007 13:08:11.834255 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:11 crc kubenswrapper[4677]: I1007 13:08:11.834311 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:11 crc kubenswrapper[4677]: I1007 13:08:11.834343 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:11 crc kubenswrapper[4677]: I1007 13:08:11.834365 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:11Z","lastTransitionTime":"2025-10-07T13:08:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:11 crc kubenswrapper[4677]: I1007 13:08:11.937194 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:11 crc kubenswrapper[4677]: I1007 13:08:11.937238 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:11 crc kubenswrapper[4677]: I1007 13:08:11.937250 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:11 crc kubenswrapper[4677]: I1007 13:08:11.937266 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:11 crc kubenswrapper[4677]: I1007 13:08:11.937277 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:11Z","lastTransitionTime":"2025-10-07T13:08:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:12 crc kubenswrapper[4677]: I1007 13:08:12.039841 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:12 crc kubenswrapper[4677]: I1007 13:08:12.039944 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:12 crc kubenswrapper[4677]: I1007 13:08:12.040002 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:12 crc kubenswrapper[4677]: I1007 13:08:12.040035 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:12 crc kubenswrapper[4677]: I1007 13:08:12.040099 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:12Z","lastTransitionTime":"2025-10-07T13:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:08:12 crc kubenswrapper[4677]: I1007 13:08:12.143390 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:12 crc kubenswrapper[4677]: I1007 13:08:12.143477 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:12 crc kubenswrapper[4677]: I1007 13:08:12.143495 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:12 crc kubenswrapper[4677]: I1007 13:08:12.143520 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:12 crc kubenswrapper[4677]: I1007 13:08:12.143620 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:12Z","lastTransitionTime":"2025-10-07T13:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:12 crc kubenswrapper[4677]: I1007 13:08:12.247177 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:12 crc kubenswrapper[4677]: I1007 13:08:12.247233 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:12 crc kubenswrapper[4677]: I1007 13:08:12.247251 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:12 crc kubenswrapper[4677]: I1007 13:08:12.247273 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:12 crc kubenswrapper[4677]: I1007 13:08:12.247291 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:12Z","lastTransitionTime":"2025-10-07T13:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:12 crc kubenswrapper[4677]: I1007 13:08:12.302408 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 13:08:12 crc kubenswrapper[4677]: E1007 13:08:12.302618 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 13:08:12 crc kubenswrapper[4677]: I1007 13:08:12.350111 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:12 crc kubenswrapper[4677]: I1007 13:08:12.350174 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:12 crc kubenswrapper[4677]: I1007 13:08:12.350191 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:12 crc kubenswrapper[4677]: I1007 13:08:12.350222 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:12 crc kubenswrapper[4677]: I1007 13:08:12.350247 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:12Z","lastTransitionTime":"2025-10-07T13:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:12 crc kubenswrapper[4677]: I1007 13:08:12.452473 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:12 crc kubenswrapper[4677]: I1007 13:08:12.452518 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:12 crc kubenswrapper[4677]: I1007 13:08:12.452533 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:12 crc kubenswrapper[4677]: I1007 13:08:12.452552 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:12 crc kubenswrapper[4677]: I1007 13:08:12.452567 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:12Z","lastTransitionTime":"2025-10-07T13:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:08:12 crc kubenswrapper[4677]: I1007 13:08:12.554784 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:12 crc kubenswrapper[4677]: I1007 13:08:12.554831 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:12 crc kubenswrapper[4677]: I1007 13:08:12.554847 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:12 crc kubenswrapper[4677]: I1007 13:08:12.554869 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:12 crc kubenswrapper[4677]: I1007 13:08:12.554882 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:12Z","lastTransitionTime":"2025-10-07T13:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:12 crc kubenswrapper[4677]: I1007 13:08:12.658157 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:12 crc kubenswrapper[4677]: I1007 13:08:12.658221 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:12 crc kubenswrapper[4677]: I1007 13:08:12.658239 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:12 crc kubenswrapper[4677]: I1007 13:08:12.658263 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:12 crc kubenswrapper[4677]: I1007 13:08:12.658279 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:12Z","lastTransitionTime":"2025-10-07T13:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:12 crc kubenswrapper[4677]: I1007 13:08:12.764348 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:12 crc kubenswrapper[4677]: I1007 13:08:12.765942 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:12 crc kubenswrapper[4677]: I1007 13:08:12.765989 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:12 crc kubenswrapper[4677]: I1007 13:08:12.766018 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:12 crc kubenswrapper[4677]: I1007 13:08:12.766039 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:12Z","lastTransitionTime":"2025-10-07T13:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:08:12 crc kubenswrapper[4677]: I1007 13:08:12.868632 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:12 crc kubenswrapper[4677]: I1007 13:08:12.868689 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:12 crc kubenswrapper[4677]: I1007 13:08:12.868703 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:12 crc kubenswrapper[4677]: I1007 13:08:12.868720 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:12 crc kubenswrapper[4677]: I1007 13:08:12.868732 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:12Z","lastTransitionTime":"2025-10-07T13:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:12 crc kubenswrapper[4677]: I1007 13:08:12.971873 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:12 crc kubenswrapper[4677]: I1007 13:08:12.971938 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:12 crc kubenswrapper[4677]: I1007 13:08:12.971956 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:12 crc kubenswrapper[4677]: I1007 13:08:12.971980 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:12 crc kubenswrapper[4677]: I1007 13:08:12.972000 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:12Z","lastTransitionTime":"2025-10-07T13:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:13 crc kubenswrapper[4677]: I1007 13:08:13.074943 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:13 crc kubenswrapper[4677]: I1007 13:08:13.074995 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:13 crc kubenswrapper[4677]: I1007 13:08:13.075013 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:13 crc kubenswrapper[4677]: I1007 13:08:13.075038 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:13 crc kubenswrapper[4677]: I1007 13:08:13.075063 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:13Z","lastTransitionTime":"2025-10-07T13:08:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:08:13 crc kubenswrapper[4677]: I1007 13:08:13.177977 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:13 crc kubenswrapper[4677]: I1007 13:08:13.178023 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:13 crc kubenswrapper[4677]: I1007 13:08:13.178040 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:13 crc kubenswrapper[4677]: I1007 13:08:13.178059 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:13 crc kubenswrapper[4677]: I1007 13:08:13.178075 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:13Z","lastTransitionTime":"2025-10-07T13:08:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:13 crc kubenswrapper[4677]: I1007 13:08:13.280881 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:13 crc kubenswrapper[4677]: I1007 13:08:13.280915 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:13 crc kubenswrapper[4677]: I1007 13:08:13.280925 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:13 crc kubenswrapper[4677]: I1007 13:08:13.280941 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:13 crc kubenswrapper[4677]: I1007 13:08:13.280951 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:13Z","lastTransitionTime":"2025-10-07T13:08:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:13 crc kubenswrapper[4677]: I1007 13:08:13.302650 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:08:13 crc kubenswrapper[4677]: I1007 13:08:13.302650 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 13:08:13 crc kubenswrapper[4677]: E1007 13:08:13.302754 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 13:08:13 crc kubenswrapper[4677]: I1007 13:08:13.302903 4677 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-8bljr" Oct 07 13:08:13 crc kubenswrapper[4677]: E1007 13:08:13.303032 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 13:08:13 crc kubenswrapper[4677]: E1007 13:08:13.303214 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8bljr" podUID="f63a77a6-7e4a-4ed0-a996-b8f80233d10c" Oct 07 13:08:13 crc kubenswrapper[4677]: I1007 13:08:13.383647 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:13 crc kubenswrapper[4677]: I1007 13:08:13.383688 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:13 crc kubenswrapper[4677]: I1007 13:08:13.383696 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:13 crc kubenswrapper[4677]: I1007 13:08:13.383711 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:13 crc kubenswrapper[4677]: I1007 13:08:13.383719 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:13Z","lastTransitionTime":"2025-10-07T13:08:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:13 crc kubenswrapper[4677]: I1007 13:08:13.486054 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:13 crc kubenswrapper[4677]: I1007 13:08:13.486099 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:13 crc kubenswrapper[4677]: I1007 13:08:13.486110 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:13 crc kubenswrapper[4677]: I1007 13:08:13.486127 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:13 crc kubenswrapper[4677]: I1007 13:08:13.486140 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:13Z","lastTransitionTime":"2025-10-07T13:08:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:08:13 crc kubenswrapper[4677]: I1007 13:08:13.588648 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:13 crc kubenswrapper[4677]: I1007 13:08:13.588706 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:13 crc kubenswrapper[4677]: I1007 13:08:13.588725 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:13 crc kubenswrapper[4677]: I1007 13:08:13.588750 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:13 crc kubenswrapper[4677]: I1007 13:08:13.588769 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:13Z","lastTransitionTime":"2025-10-07T13:08:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:13 crc kubenswrapper[4677]: I1007 13:08:13.691691 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:13 crc kubenswrapper[4677]: I1007 13:08:13.691773 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:13 crc kubenswrapper[4677]: I1007 13:08:13.691791 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:13 crc kubenswrapper[4677]: I1007 13:08:13.691825 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:13 crc kubenswrapper[4677]: I1007 13:08:13.691843 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:13Z","lastTransitionTime":"2025-10-07T13:08:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:13 crc kubenswrapper[4677]: I1007 13:08:13.795683 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:13 crc kubenswrapper[4677]: I1007 13:08:13.795734 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:13 crc kubenswrapper[4677]: I1007 13:08:13.795747 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:13 crc kubenswrapper[4677]: I1007 13:08:13.795768 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:13 crc kubenswrapper[4677]: I1007 13:08:13.795783 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:13Z","lastTransitionTime":"2025-10-07T13:08:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:08:13 crc kubenswrapper[4677]: I1007 13:08:13.898613 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:13 crc kubenswrapper[4677]: I1007 13:08:13.898662 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:13 crc kubenswrapper[4677]: I1007 13:08:13.898677 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:13 crc kubenswrapper[4677]: I1007 13:08:13.898700 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:13 crc kubenswrapper[4677]: I1007 13:08:13.898715 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:13Z","lastTransitionTime":"2025-10-07T13:08:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:14 crc kubenswrapper[4677]: I1007 13:08:14.000904 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:14 crc kubenswrapper[4677]: I1007 13:08:14.000932 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:14 crc kubenswrapper[4677]: I1007 13:08:14.000940 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:14 crc kubenswrapper[4677]: I1007 13:08:14.000952 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:14 crc kubenswrapper[4677]: I1007 13:08:14.000961 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:14Z","lastTransitionTime":"2025-10-07T13:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:14 crc kubenswrapper[4677]: I1007 13:08:14.103765 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:14 crc kubenswrapper[4677]: I1007 13:08:14.103824 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:14 crc kubenswrapper[4677]: I1007 13:08:14.103836 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:14 crc kubenswrapper[4677]: I1007 13:08:14.103859 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:14 crc kubenswrapper[4677]: I1007 13:08:14.103872 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:14Z","lastTransitionTime":"2025-10-07T13:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:08:14 crc kubenswrapper[4677]: I1007 13:08:14.206658 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:14 crc kubenswrapper[4677]: I1007 13:08:14.206727 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:14 crc kubenswrapper[4677]: I1007 13:08:14.206740 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:14 crc kubenswrapper[4677]: I1007 13:08:14.206765 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:14 crc kubenswrapper[4677]: I1007 13:08:14.206780 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:14Z","lastTransitionTime":"2025-10-07T13:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:14 crc kubenswrapper[4677]: I1007 13:08:14.302788 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 13:08:14 crc kubenswrapper[4677]: E1007 13:08:14.302978 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 13:08:14 crc kubenswrapper[4677]: I1007 13:08:14.310160 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:14 crc kubenswrapper[4677]: I1007 13:08:14.310215 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:14 crc kubenswrapper[4677]: I1007 13:08:14.310234 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:14 crc kubenswrapper[4677]: I1007 13:08:14.310255 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:14 crc kubenswrapper[4677]: I1007 13:08:14.310272 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:14Z","lastTransitionTime":"2025-10-07T13:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:08:14 crc kubenswrapper[4677]: I1007 13:08:14.412593 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:14 crc kubenswrapper[4677]: I1007 13:08:14.412660 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:14 crc kubenswrapper[4677]: I1007 13:08:14.412683 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:14 crc kubenswrapper[4677]: I1007 13:08:14.412712 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:14 crc kubenswrapper[4677]: I1007 13:08:14.412731 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:14Z","lastTransitionTime":"2025-10-07T13:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:14 crc kubenswrapper[4677]: I1007 13:08:14.514937 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:14 crc kubenswrapper[4677]: I1007 13:08:14.514978 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:14 crc kubenswrapper[4677]: I1007 13:08:14.514989 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:14 crc kubenswrapper[4677]: I1007 13:08:14.515003 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:14 crc kubenswrapper[4677]: I1007 13:08:14.515013 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:14Z","lastTransitionTime":"2025-10-07T13:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:14 crc kubenswrapper[4677]: I1007 13:08:14.617069 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:14 crc kubenswrapper[4677]: I1007 13:08:14.617111 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:14 crc kubenswrapper[4677]: I1007 13:08:14.617122 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:14 crc kubenswrapper[4677]: I1007 13:08:14.617134 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:14 crc kubenswrapper[4677]: I1007 13:08:14.617143 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:14Z","lastTransitionTime":"2025-10-07T13:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:08:14 crc kubenswrapper[4677]: I1007 13:08:14.719455 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:14 crc kubenswrapper[4677]: I1007 13:08:14.719514 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:14 crc kubenswrapper[4677]: I1007 13:08:14.719535 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:14 crc kubenswrapper[4677]: I1007 13:08:14.719562 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:14 crc kubenswrapper[4677]: I1007 13:08:14.719581 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:14Z","lastTransitionTime":"2025-10-07T13:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:14 crc kubenswrapper[4677]: I1007 13:08:14.821786 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:14 crc kubenswrapper[4677]: I1007 13:08:14.821827 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:14 crc kubenswrapper[4677]: I1007 13:08:14.821841 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:14 crc kubenswrapper[4677]: I1007 13:08:14.821856 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:14 crc kubenswrapper[4677]: I1007 13:08:14.821867 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:14Z","lastTransitionTime":"2025-10-07T13:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:14 crc kubenswrapper[4677]: I1007 13:08:14.925542 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:14 crc kubenswrapper[4677]: I1007 13:08:14.925621 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:14 crc kubenswrapper[4677]: I1007 13:08:14.925645 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:14 crc kubenswrapper[4677]: I1007 13:08:14.925677 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:14 crc kubenswrapper[4677]: I1007 13:08:14.925700 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:14Z","lastTransitionTime":"2025-10-07T13:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:08:15 crc kubenswrapper[4677]: I1007 13:08:15.028035 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:15 crc kubenswrapper[4677]: I1007 13:08:15.028132 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:15 crc kubenswrapper[4677]: I1007 13:08:15.028156 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:15 crc kubenswrapper[4677]: I1007 13:08:15.028185 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:15 crc kubenswrapper[4677]: I1007 13:08:15.028206 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:15Z","lastTransitionTime":"2025-10-07T13:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:15 crc kubenswrapper[4677]: I1007 13:08:15.110541 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f63a77a6-7e4a-4ed0-a996-b8f80233d10c-metrics-certs\") pod \"network-metrics-daemon-8bljr\" (UID: \"f63a77a6-7e4a-4ed0-a996-b8f80233d10c\") " pod="openshift-multus/network-metrics-daemon-8bljr" Oct 07 13:08:15 crc kubenswrapper[4677]: E1007 13:08:15.110739 4677 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 07 13:08:15 crc kubenswrapper[4677]: E1007 13:08:15.110827 4677 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f63a77a6-7e4a-4ed0-a996-b8f80233d10c-metrics-certs podName:f63a77a6-7e4a-4ed0-a996-b8f80233d10c nodeName:}" failed. No retries permitted until 2025-10-07 13:08:47.110798635 +0000 UTC m=+98.596507790 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f63a77a6-7e4a-4ed0-a996-b8f80233d10c-metrics-certs") pod "network-metrics-daemon-8bljr" (UID: "f63a77a6-7e4a-4ed0-a996-b8f80233d10c") : object "openshift-multus"/"metrics-daemon-secret" not registered
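
The MountVolume failure recorded above is not retried immediately: the kubelet blocks further attempts on this volume until 13:08:47, a durationBeforeRetry of 32s, apparently because the referenced Secret (openshift-multus/metrics-daemon-secret) is not yet known to it ("not registered"). Growing waits like this are what a per-operation exponential backoff produces. The Go sketch below only reproduces that pattern; the 500ms initial delay, the doubling factor, the 2-minute cap and the nextDelay helper are illustrative assumptions, not the kubelet's actual constants or code.

// backoff_sketch.go: reproduces the growing durationBeforeRetry pattern seen in the
// nestedpendingoperations message above, using assumed constants.
package main

import (
	"fmt"
	"time"
)

// nextDelay doubles the previous delay and clamps it at maxDelay; the first call returns initial.
func nextDelay(prev, initial, maxDelay time.Duration) time.Duration {
	if prev == 0 {
		return initial
	}
	d := prev * 2
	if d > maxDelay {
		return maxDelay
	}
	return d
}

func main() {
	const (
		initial  = 500 * time.Millisecond // assumed first retry delay
		maxDelay = 2 * time.Minute        // assumed cap
	)
	var delay time.Duration
	lastErrorAt := time.Now()
	for attempt := 1; attempt <= 8; attempt++ {
		delay = nextDelay(delay, initial, maxDelay)
		retryAt := lastErrorAt.Add(delay)
		fmt.Printf("attempt %d failed: no retries permitted until %s (durationBeforeRetry %s)\n",
			attempt, retryAt.Format(time.RFC3339), delay)
		lastErrorAt = retryAt // assume the retry itself fails straight away
	}
}

Run as-is it prints a doubling sequence (500ms, 1s, 2s, 4s, ...), which is consistent with the 32s in the log if several earlier attempts had already failed since the kubelet started.
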
Oct 07 13:08:15 crc kubenswrapper[4677]: I1007 13:08:15.130965 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:15 crc kubenswrapper[4677]: I1007 13:08:15.131012 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:15 crc kubenswrapper[4677]: I1007 13:08:15.131034 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:15 crc kubenswrapper[4677]: I1007 13:08:15.131056 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:15 crc kubenswrapper[4677]: I1007 13:08:15.131073 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:15Z","lastTransitionTime":"2025-10-07T13:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:15 crc kubenswrapper[4677]: I1007 13:08:15.233657 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:15 crc kubenswrapper[4677]: I1007 13:08:15.233719 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:15 crc kubenswrapper[4677]: I1007 13:08:15.233737 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:15 crc kubenswrapper[4677]: I1007 13:08:15.233764 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:15 crc kubenswrapper[4677]: I1007 13:08:15.233781 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:15Z","lastTransitionTime":"2025-10-07T13:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:15 crc kubenswrapper[4677]: I1007 13:08:15.302918 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:08:15 crc kubenswrapper[4677]: I1007 13:08:15.302941 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
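
The repeated "No sandbox for pod can be found" and NodeNotReady entries all trace back to the same condition quoted in every status update: the container runtime reports NetworkReady=false because there is no CNI configuration file in /etc/kubernetes/cni/net.d/. The sketch below shows the kind of readiness probe that message implies; the directory path is the one quoted in the log, while the extension list (.conf, .conflist, .json) is the conventional CNI set and the hasCNIConfig helper is hypothetical, so this is an illustration rather than the CRI-O or kubelet implementation.

// cnicheck.go: reports whether a CNI configuration directory contains at least one
// network configuration, loosely mirroring the readiness condition in the log above.
package main

import (
	"fmt"
	"os"
	"path/filepath"
)

// hasCNIConfig returns true if confDir holds at least one file with a CNI configuration extension.
func hasCNIConfig(confDir string) (bool, error) {
	entries, err := os.ReadDir(confDir)
	if err != nil {
		return false, err
	}
	for _, e := range entries {
		if e.IsDir() {
			continue
		}
		switch filepath.Ext(e.Name()) {
		case ".conf", ".conflist", ".json":
			return true, nil
		}
	}
	return false, nil
}

func main() {
	confDir := "/etc/kubernetes/cni/net.d" // directory named in the kubelet messages above
	ok, err := hasCNIConfig(confDir)
	if err != nil {
		fmt.Printf("cannot read %s: %v\n", confDir, err)
		return
	}
	if ok {
		fmt.Println("NetworkReady=true: CNI configuration present")
	} else {
		fmt.Printf("NetworkReady=false: no CNI configuration file in %s\n", confDir)
	}
}

On this node the probe would keep reporting NetworkReady=false until the cluster network operator writes a configuration into that directory, after which the kubelet's next status sync can clear the NotReady condition.
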
Oct 07 13:08:15 crc kubenswrapper[4677]: I1007 13:08:15.302972 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8bljr" Oct 07 13:08:15 crc kubenswrapper[4677]: E1007 13:08:15.303025 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 13:08:15 crc kubenswrapper[4677]: E1007 13:08:15.303212 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8bljr" podUID="f63a77a6-7e4a-4ed0-a996-b8f80233d10c" Oct 07 13:08:15 crc kubenswrapper[4677]: E1007 13:08:15.303310 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 13:08:15 crc kubenswrapper[4677]: I1007 13:08:15.336046 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:15 crc kubenswrapper[4677]: I1007 13:08:15.336084 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:15 crc kubenswrapper[4677]: I1007 13:08:15.336095 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:15 crc kubenswrapper[4677]: I1007 13:08:15.336110 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:15 crc kubenswrapper[4677]: I1007 13:08:15.336121 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:15Z","lastTransitionTime":"2025-10-07T13:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:08:15 crc kubenswrapper[4677]: I1007 13:08:15.438566 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:15 crc kubenswrapper[4677]: I1007 13:08:15.438615 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:15 crc kubenswrapper[4677]: I1007 13:08:15.438627 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:15 crc kubenswrapper[4677]: I1007 13:08:15.438647 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:15 crc kubenswrapper[4677]: I1007 13:08:15.438660 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:15Z","lastTransitionTime":"2025-10-07T13:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:15 crc kubenswrapper[4677]: I1007 13:08:15.541302 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:15 crc kubenswrapper[4677]: I1007 13:08:15.541341 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:15 crc kubenswrapper[4677]: I1007 13:08:15.541351 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:15 crc kubenswrapper[4677]: I1007 13:08:15.541366 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:15 crc kubenswrapper[4677]: I1007 13:08:15.541375 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:15Z","lastTransitionTime":"2025-10-07T13:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:15 crc kubenswrapper[4677]: I1007 13:08:15.643752 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:15 crc kubenswrapper[4677]: I1007 13:08:15.644669 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:15 crc kubenswrapper[4677]: I1007 13:08:15.644871 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:15 crc kubenswrapper[4677]: I1007 13:08:15.645033 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:15 crc kubenswrapper[4677]: I1007 13:08:15.645207 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:15Z","lastTransitionTime":"2025-10-07T13:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:08:15 crc kubenswrapper[4677]: I1007 13:08:15.747912 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:15 crc kubenswrapper[4677]: I1007 13:08:15.748240 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:15 crc kubenswrapper[4677]: I1007 13:08:15.748319 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:15 crc kubenswrapper[4677]: I1007 13:08:15.748413 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:15 crc kubenswrapper[4677]: I1007 13:08:15.748522 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:15Z","lastTransitionTime":"2025-10-07T13:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:15 crc kubenswrapper[4677]: I1007 13:08:15.850979 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:15 crc kubenswrapper[4677]: I1007 13:08:15.851020 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:15 crc kubenswrapper[4677]: I1007 13:08:15.851028 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:15 crc kubenswrapper[4677]: I1007 13:08:15.851043 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:15 crc kubenswrapper[4677]: I1007 13:08:15.851052 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:15Z","lastTransitionTime":"2025-10-07T13:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:15 crc kubenswrapper[4677]: I1007 13:08:15.953760 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:15 crc kubenswrapper[4677]: I1007 13:08:15.954142 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:15 crc kubenswrapper[4677]: I1007 13:08:15.954305 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:15 crc kubenswrapper[4677]: I1007 13:08:15.954932 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:15 crc kubenswrapper[4677]: I1007 13:08:15.955095 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:15Z","lastTransitionTime":"2025-10-07T13:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:08:16 crc kubenswrapper[4677]: I1007 13:08:16.057171 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:16 crc kubenswrapper[4677]: I1007 13:08:16.057201 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:16 crc kubenswrapper[4677]: I1007 13:08:16.057209 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:16 crc kubenswrapper[4677]: I1007 13:08:16.057221 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:16 crc kubenswrapper[4677]: I1007 13:08:16.057229 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:16Z","lastTransitionTime":"2025-10-07T13:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:16 crc kubenswrapper[4677]: I1007 13:08:16.160165 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:16 crc kubenswrapper[4677]: I1007 13:08:16.160211 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:16 crc kubenswrapper[4677]: I1007 13:08:16.160223 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:16 crc kubenswrapper[4677]: I1007 13:08:16.160237 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:16 crc kubenswrapper[4677]: I1007 13:08:16.160247 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:16Z","lastTransitionTime":"2025-10-07T13:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:16 crc kubenswrapper[4677]: I1007 13:08:16.204664 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:16 crc kubenswrapper[4677]: I1007 13:08:16.204722 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:16 crc kubenswrapper[4677]: I1007 13:08:16.204739 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:16 crc kubenswrapper[4677]: I1007 13:08:16.204762 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:16 crc kubenswrapper[4677]: I1007 13:08:16.204779 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:16Z","lastTransitionTime":"2025-10-07T13:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:08:16 crc kubenswrapper[4677]: E1007 13:08:16.223482 4677 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:08:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:08:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:08:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:08:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:08:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:08:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:08:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:08:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2461c0fe-8a8b-483d-90f2-2a3d8d7aca47\\\",\\\"systemUUID\\\":\\\"68c6c527-b248-4c1e-9fd2-b44685e78bcf\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:16Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:16 crc kubenswrapper[4677]: I1007 13:08:16.228830 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:16 crc kubenswrapper[4677]: I1007 13:08:16.228894 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 07 13:08:16 crc kubenswrapper[4677]: I1007 13:08:16.228916 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:16 crc kubenswrapper[4677]: I1007 13:08:16.228945 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:16 crc kubenswrapper[4677]: I1007 13:08:16.228969 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:16Z","lastTransitionTime":"2025-10-07T13:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:16 crc kubenswrapper[4677]: E1007 13:08:16.247073 4677 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:08:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:08:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:08:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:08:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:08:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:08:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:08:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:08:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2461c0fe-8a8b-483d-90f2-2a3d8d7aca47\\\",\\\"systemUUID\\\":\\\"68c6c527-b248-4c1e-9fd2-b44685e78bcf\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:16Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:16 crc kubenswrapper[4677]: I1007 13:08:16.252013 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:16 crc kubenswrapper[4677]: I1007 13:08:16.252085 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 07 13:08:16 crc kubenswrapper[4677]: I1007 13:08:16.252108 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:16 crc kubenswrapper[4677]: I1007 13:08:16.252140 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:16 crc kubenswrapper[4677]: I1007 13:08:16.252165 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:16Z","lastTransitionTime":"2025-10-07T13:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:16 crc kubenswrapper[4677]: E1007 13:08:16.266569 4677 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:08:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:08:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:08:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:08:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:08:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:08:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:08:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:08:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2461c0fe-8a8b-483d-90f2-2a3d8d7aca47\\\",\\\"systemUUID\\\":\\\"68c6c527-b248-4c1e-9fd2-b44685e78bcf\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:16Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:16 crc kubenswrapper[4677]: I1007 13:08:16.270497 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:16 crc kubenswrapper[4677]: I1007 13:08:16.270571 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 07 13:08:16 crc kubenswrapper[4677]: I1007 13:08:16.270597 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:16 crc kubenswrapper[4677]: I1007 13:08:16.270630 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:16 crc kubenswrapper[4677]: I1007 13:08:16.270657 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:16Z","lastTransitionTime":"2025-10-07T13:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:16 crc kubenswrapper[4677]: E1007 13:08:16.291025 4677 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:08:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:08:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:08:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:08:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:08:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:08:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:08:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:08:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2461c0fe-8a8b-483d-90f2-2a3d8d7aca47\\\",\\\"systemUUID\\\":\\\"68c6c527-b248-4c1e-9fd2-b44685e78bcf\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:16Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:16 crc kubenswrapper[4677]: I1007 13:08:16.296013 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:16 crc kubenswrapper[4677]: I1007 13:08:16.296236 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 07 13:08:16 crc kubenswrapper[4677]: I1007 13:08:16.296267 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:16 crc kubenswrapper[4677]: I1007 13:08:16.296292 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:16 crc kubenswrapper[4677]: I1007 13:08:16.296311 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:16Z","lastTransitionTime":"2025-10-07T13:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:16 crc kubenswrapper[4677]: I1007 13:08:16.302018 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 13:08:16 crc kubenswrapper[4677]: E1007 13:08:16.302180 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 13:08:16 crc kubenswrapper[4677]: E1007 13:08:16.315513 4677 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:08:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:08:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:08:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:08:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:08:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:08:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:08:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:08:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2461c0fe-8a8b-483d-90f2-2a3d8d7aca47\\\",\\\"systemUUID\\\":\\\"68c6c527-b248-4c1e-9fd2-b44685e78bcf\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:16Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:16 crc kubenswrapper[4677]: E1007 13:08:16.315662 4677 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 07 13:08:16 crc kubenswrapper[4677]: I1007 13:08:16.317586 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Oct 07 13:08:16 crc kubenswrapper[4677]: I1007 13:08:16.317614 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:16 crc kubenswrapper[4677]: I1007 13:08:16.317625 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:16 crc kubenswrapper[4677]: I1007 13:08:16.317641 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:16 crc kubenswrapper[4677]: I1007 13:08:16.317652 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:16Z","lastTransitionTime":"2025-10-07T13:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:16 crc kubenswrapper[4677]: I1007 13:08:16.420424 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:16 crc kubenswrapper[4677]: I1007 13:08:16.420507 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:16 crc kubenswrapper[4677]: I1007 13:08:16.420526 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:16 crc kubenswrapper[4677]: I1007 13:08:16.420550 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:16 crc kubenswrapper[4677]: I1007 13:08:16.420566 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:16Z","lastTransitionTime":"2025-10-07T13:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:16 crc kubenswrapper[4677]: I1007 13:08:16.523080 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:16 crc kubenswrapper[4677]: I1007 13:08:16.523124 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:16 crc kubenswrapper[4677]: I1007 13:08:16.523133 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:16 crc kubenswrapper[4677]: I1007 13:08:16.523149 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:16 crc kubenswrapper[4677]: I1007 13:08:16.523158 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:16Z","lastTransitionTime":"2025-10-07T13:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:08:16 crc kubenswrapper[4677]: I1007 13:08:16.625841 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:16 crc kubenswrapper[4677]: I1007 13:08:16.625894 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:16 crc kubenswrapper[4677]: I1007 13:08:16.625911 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:16 crc kubenswrapper[4677]: I1007 13:08:16.625932 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:16 crc kubenswrapper[4677]: I1007 13:08:16.625950 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:16Z","lastTransitionTime":"2025-10-07T13:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:16 crc kubenswrapper[4677]: I1007 13:08:16.728043 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:16 crc kubenswrapper[4677]: I1007 13:08:16.728113 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:16 crc kubenswrapper[4677]: I1007 13:08:16.728132 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:16 crc kubenswrapper[4677]: I1007 13:08:16.728156 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:16 crc kubenswrapper[4677]: I1007 13:08:16.728175 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:16Z","lastTransitionTime":"2025-10-07T13:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:16 crc kubenswrapper[4677]: I1007 13:08:16.830081 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:16 crc kubenswrapper[4677]: I1007 13:08:16.830133 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:16 crc kubenswrapper[4677]: I1007 13:08:16.830150 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:16 crc kubenswrapper[4677]: I1007 13:08:16.830173 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:16 crc kubenswrapper[4677]: I1007 13:08:16.830190 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:16Z","lastTransitionTime":"2025-10-07T13:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:08:16 crc kubenswrapper[4677]: I1007 13:08:16.938314 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:16 crc kubenswrapper[4677]: I1007 13:08:16.938404 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:16 crc kubenswrapper[4677]: I1007 13:08:16.938457 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:16 crc kubenswrapper[4677]: I1007 13:08:16.938485 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:16 crc kubenswrapper[4677]: I1007 13:08:16.938512 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:16Z","lastTransitionTime":"2025-10-07T13:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:17 crc kubenswrapper[4677]: I1007 13:08:17.041162 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:17 crc kubenswrapper[4677]: I1007 13:08:17.041207 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:17 crc kubenswrapper[4677]: I1007 13:08:17.041217 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:17 crc kubenswrapper[4677]: I1007 13:08:17.041229 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:17 crc kubenswrapper[4677]: I1007 13:08:17.041237 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:17Z","lastTransitionTime":"2025-10-07T13:08:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:17 crc kubenswrapper[4677]: I1007 13:08:17.144010 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:17 crc kubenswrapper[4677]: I1007 13:08:17.144033 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:17 crc kubenswrapper[4677]: I1007 13:08:17.144041 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:17 crc kubenswrapper[4677]: I1007 13:08:17.144052 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:17 crc kubenswrapper[4677]: I1007 13:08:17.144060 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:17Z","lastTransitionTime":"2025-10-07T13:08:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:08:17 crc kubenswrapper[4677]: I1007 13:08:17.245962 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:17 crc kubenswrapper[4677]: I1007 13:08:17.245991 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:17 crc kubenswrapper[4677]: I1007 13:08:17.246015 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:17 crc kubenswrapper[4677]: I1007 13:08:17.246035 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:17 crc kubenswrapper[4677]: I1007 13:08:17.246051 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:17Z","lastTransitionTime":"2025-10-07T13:08:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:17 crc kubenswrapper[4677]: I1007 13:08:17.302851 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8bljr" Oct 07 13:08:17 crc kubenswrapper[4677]: I1007 13:08:17.302901 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 13:08:17 crc kubenswrapper[4677]: I1007 13:08:17.302914 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:08:17 crc kubenswrapper[4677]: E1007 13:08:17.303127 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8bljr" podUID="f63a77a6-7e4a-4ed0-a996-b8f80233d10c" Oct 07 13:08:17 crc kubenswrapper[4677]: E1007 13:08:17.303217 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 13:08:17 crc kubenswrapper[4677]: E1007 13:08:17.303320 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 13:08:17 crc kubenswrapper[4677]: I1007 13:08:17.347871 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:17 crc kubenswrapper[4677]: I1007 13:08:17.347902 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:17 crc kubenswrapper[4677]: I1007 13:08:17.347912 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:17 crc kubenswrapper[4677]: I1007 13:08:17.347925 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:17 crc kubenswrapper[4677]: I1007 13:08:17.347934 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:17Z","lastTransitionTime":"2025-10-07T13:08:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:17 crc kubenswrapper[4677]: I1007 13:08:17.450728 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:17 crc kubenswrapper[4677]: I1007 13:08:17.450760 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:17 crc kubenswrapper[4677]: I1007 13:08:17.450769 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:17 crc kubenswrapper[4677]: I1007 13:08:17.450780 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:17 crc kubenswrapper[4677]: I1007 13:08:17.450789 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:17Z","lastTransitionTime":"2025-10-07T13:08:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:08:17 crc kubenswrapper[4677]: I1007 13:08:17.553887 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:17 crc kubenswrapper[4677]: I1007 13:08:17.553933 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:17 crc kubenswrapper[4677]: I1007 13:08:17.553950 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:17 crc kubenswrapper[4677]: I1007 13:08:17.553971 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:17 crc kubenswrapper[4677]: I1007 13:08:17.553988 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:17Z","lastTransitionTime":"2025-10-07T13:08:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:17 crc kubenswrapper[4677]: I1007 13:08:17.656849 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:17 crc kubenswrapper[4677]: I1007 13:08:17.656914 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:17 crc kubenswrapper[4677]: I1007 13:08:17.656932 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:17 crc kubenswrapper[4677]: I1007 13:08:17.656957 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:17 crc kubenswrapper[4677]: I1007 13:08:17.656973 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:17Z","lastTransitionTime":"2025-10-07T13:08:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:17 crc kubenswrapper[4677]: I1007 13:08:17.760274 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:17 crc kubenswrapper[4677]: I1007 13:08:17.760331 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:17 crc kubenswrapper[4677]: I1007 13:08:17.760355 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:17 crc kubenswrapper[4677]: I1007 13:08:17.760385 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:17 crc kubenswrapper[4677]: I1007 13:08:17.760407 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:17Z","lastTransitionTime":"2025-10-07T13:08:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:08:17 crc kubenswrapper[4677]: I1007 13:08:17.764320 4677 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-pjgpx_73bebfb3-50b5-48b6-b348-1d1feb6202d2/kube-multus/0.log" Oct 07 13:08:17 crc kubenswrapper[4677]: I1007 13:08:17.764372 4677 generic.go:334] "Generic (PLEG): container finished" podID="73bebfb3-50b5-48b6-b348-1d1feb6202d2" containerID="6b0ac92c71edc3d5107aece2d0e005a546cf25d79d696f4e330b7c0c8babc546" exitCode=1 Oct 07 13:08:17 crc kubenswrapper[4677]: I1007 13:08:17.764406 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-pjgpx" event={"ID":"73bebfb3-50b5-48b6-b348-1d1feb6202d2","Type":"ContainerDied","Data":"6b0ac92c71edc3d5107aece2d0e005a546cf25d79d696f4e330b7c0c8babc546"} Oct 07 13:08:17 crc kubenswrapper[4677]: I1007 13:08:17.764824 4677 scope.go:117] "RemoveContainer" containerID="6b0ac92c71edc3d5107aece2d0e005a546cf25d79d696f4e330b7c0c8babc546" Oct 07 13:08:17 crc kubenswrapper[4677]: I1007 13:08:17.785705 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pjgpx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"73bebfb3-50b5-48b6-b348-1d1feb6202d2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:08:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:08:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b0ac92c71edc3d5107aece2d0e005a546cf25d79d696f4e330b7c0c8babc546\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b0ac92c71edc3d5107aece2d0e005a546cf25d79d696f4e330b7c0c8babc546\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T13:08:16Z\\\",\\\"message\\\":\\\"2025-10-07T13:07:31+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_d2e9730f-3112-4cf9-bde0-fbf3e73808c6\\\\n2025-10-07T13:07:31+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_d2e9730f-3112-4cf9-bde0-fbf3e73808c6 to /host/opt/cni/bin/\\\\n2025-10-07T13:07:31Z [verbose] multus-daemon started\\\\n2025-10-07T13:07:31Z [verbose] Readiness Indicator file 
check\\\\n2025-10-07T13:08:16Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h59cg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pjgpx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:17Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:17 crc kubenswrapper[4677]: I1007 13:08:17.801923 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qf2v9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea5a5436-29f6-4edd-9d4d-22eb9dd828c3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1a8e8a31adbc84ed02ff984941bb00da95740b19e8717fc6d4fb39b62338973\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ftmxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://97188126bcea5ad3844f74c9402e831926e1142944778240b4d4b26da7ea40c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ftmxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qf2v9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:17Z is after 2025-08-24T17:21:41Z" Oct 07 
13:08:17 crc kubenswrapper[4677]: I1007 13:08:17.814263 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8bljr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f63a77a6-7e4a-4ed0-a996-b8f80233d10c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rc97j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rc97j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:43Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8bljr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:17Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:17 crc kubenswrapper[4677]: I1007 13:08:17.832965 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c40c47d2-50a4-43f1-9b6e-08b60a3260c5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f36e52a7e88b59d8fd38c1fe659ce9b539e514c9d31e326a3ed647ebb8d19781\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://84b1c015461fecca9e5122abe950f33e24f4b7188568ea84cb059a08a4637963\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ecf81a2a9f147c0d9643f8e6c45248164053203ca4e5bbdc57c38e5803a5386\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e8dc3f8bdc52104efdb49a017d6497e2aaa3ed2b593794413fcd1acf2e06d36\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:17Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:17 crc kubenswrapper[4677]: I1007 13:08:17.847744 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:17Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:17 crc kubenswrapper[4677]: I1007 13:08:17.860455 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-r7cnz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7879fa59-a7cb-4d29-ba3a-c91f43bfcba6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5b2cfaaf4533573a7cdf927cb9a0b61690f4f04ca22f5da5013fd218ee2cba1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r59hd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d3e4ef8267212ad1faf24bfcb3b6f633a283684ba587e304e94d434bd9a2618\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r59hd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-r7cnz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:17Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:17 crc kubenswrapper[4677]: I1007 13:08:17.861827 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:17 crc kubenswrapper[4677]: I1007 13:08:17.861887 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:17 crc kubenswrapper[4677]: I1007 13:08:17.861905 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:17 crc kubenswrapper[4677]: I1007 13:08:17.861938 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:17 crc kubenswrapper[4677]: I1007 13:08:17.861956 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:17Z","lastTransitionTime":"2025-10-07T13:08:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:08:17 crc kubenswrapper[4677]: I1007 13:08:17.883878 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29c8j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3458826a-000d-407d-92c8-236d1a05842e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cf7d8cdd34bc883eae38c5e4690efd4e1c29cc633b5bbadc5de2b5b844a9da3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b77b2aafb3baf1c5b72d62156bd1c1bec76385637d5795166fe3d4f22a169503\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\
":\\\"cri-o://99b5fbb5ad3249aa5264c37bd635ed5f6283ec72c7eb071002cd7bddc12052f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eee7c253a1a514447553be977a3e534608ef6a1178664bf139ee84ec41180db0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f333db7aeb7d3cd308131992b4cd1284c1c56e27bbfd731404febc0efc953925\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ddf4e352b778815786f6fb204486a53d958310e53569f89a2895fe388a727da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d41f7a2621bf9c7b0e852cc4fb11dc29a2856c5b6519628df5c77b064c310b1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d41f7a2621bf9c7b0e852cc4fb11dc29a2856c5b6519628df5c77b064c310b1b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T13:08:06Z\\\",\\\"message\\\":\\\"release.openshift.io/self-managed-high-availability:true include.release.openshift.io/single-node-developer:true service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-secret-name:serving-cert service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [{config.openshift.io/v1 ClusterVersion version 9101b518-476b-4eea-8fa6-69b0534e5caa 0xc00767ca9f \\\\u003cnil\\\\u003e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:https,Protocol:TCP,Port:443,TargetPort:{0 8443 },NodePort:0,AppProtocol:nil,},},Selector:map[string]string{app: service-ca-operator,},ClusterIP:10.217.4.40,Type:ClusterIP,ExternalIPs:[],SessionAffinity:None,LoadBalancerIP:,LoadBalancerSourceRanges:[],ExternalName:,ExternalTrafficPolicy:,HealthCheckNodePort:0,PublishNotReadyAddresses:false,SessionAffinityConfig:nil,IPFamilyPolicy:*SingleStack,ClusterIPs:[10.217.4.40],IPFamilies:[IPv4],AllocateLoadBalancerNodePorts:nil,LoadBalancerClass:nil,InternalTrafficPolicy:*Cluster,TrafficDistribution:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ingress:[]LoadBalancerIngress{},},Conditions:[]Condition{},},}\\\\nF1007 13:08:06.342882 6400 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:08:05Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-29c8j_openshift-ovn-kubernetes(3458826a-000d-407d-92c8-236d1a05842e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1f410624ff7e026c196d43d5ef830ce7b34981b703d5399a135dab0122640ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa229d0805576aa04db8e58e52e8c8bdc07663414dde42a94cac4fe628cfac7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa229d0805576aa04db8e58e52e8c8bdc07663414dde42a94cac4fe628cfac7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-29c8j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:17Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:17 crc kubenswrapper[4677]: I1007 13:08:17.897006 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c3400d7-6126-498b-ba93-b88903b8d698\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:08:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:08:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffff3685813b0115a56c61e90cb80318d0265429d9be16eeb9a4d0870ec2a442\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://10f7bfbc2ad9b8a554ee30118d74323aeddc334939ab97ab61cbd5eb24ae1db3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c
97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c50c0057805c0f200830303329e9c3c8c75c20246ace7131caff6afb6aca6f79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6cd7e95c3dad799fcc99041e53970f7f6be7e9f4280d724394e9a06051043706\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6cd7e95c3dad799fcc99041e53970f7f6be7e9f4280d724394e9a06051043706\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:10Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:09Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:17Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:17 crc kubenswrapper[4677]: I1007 13:08:17.911479 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf9347ca53bc58ad2e19bdbccd5eb40fde5ef36cdc0c2a2899e7e86977208446\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7932ee6d24ab75f34dabb17b5c2732dc1437e94b4fab6cace5c5bf4d8b4a8fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:17Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:17 crc kubenswrapper[4677]: I1007 13:08:17.921911 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c2h2k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6a7b491-6ed9-4906-8d2d-d8913a581b95\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://079f44a0676fd6e659268707658dfce76f5c80881ebd1b7f77b831a653002cbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gh4gv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c2h2k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:17Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:17 crc kubenswrapper[4677]: I1007 13:08:17.935123 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-czmsr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"67f7f734-b59a-447c-b4a5-5aeb78d3a4dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://620c115cda692d86e1c655fe633ade8d56b4ad3faff70ec3383e0d6931e91acd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84e2aad54ae491e74845fbb681491bde96694b5f242d15a1b4c0f62b81048e7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84e2aad54ae491e74845fbb681491bde96694b5f242d15a1b4c0f62b81048e7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2dfd4b5d1804d9d8e9ccf4dea8fb23cf60cd039e32d4bfc7d0b0e189da178b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2dfd4b5d1804d9d8e9ccf4dea8fb23cf60cd039e32d4bfc7d0b0e189da178b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c337ed2525a5d0aca41ff193138829ecd98172765110f97081d5a9b57ea39519\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c337ed2525a5d0aca41ff193138829ecd98172765110f97081d5a9b57ea39519\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b53d862bdcbdf9ebb58ca6717073ecf18236be981bf633bf22e3a077fa6ed90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b53d862bdcbdf9ebb58ca6717073ecf18236be981bf633bf22e3a077fa6ed90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3de85e8780fadb509355343e297ea73c2bd1047219be9911d445e044d40d378b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3de85e8780fadb509355343e297ea73c2bd1047219be9911d445e044d40d378b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6309cf358d3d85018e9e89fdb974146ab00944de09d3b38cfbf357df59b442f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6309cf358d3d85018e9e89fdb974146ab00944de09d3b38cfbf357df59b442f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-czmsr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:17Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:17 crc kubenswrapper[4677]: I1007 13:08:17.948209 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:17Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:17 crc kubenswrapper[4677]: I1007 13:08:17.959258 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8xd94" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f78c6e9a-e5e3-4296-b2e6-3ba36d1808ff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2e7ebbc9f01ac7f853075c65c8cc57c691cf3f95e41036294486ad4a3bb807c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8xd94\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:17Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:17 crc kubenswrapper[4677]: I1007 13:08:17.963910 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:17 crc kubenswrapper[4677]: I1007 13:08:17.963938 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:17 crc kubenswrapper[4677]: I1007 13:08:17.963947 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:17 crc kubenswrapper[4677]: I1007 13:08:17.963959 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:17 crc kubenswrapper[4677]: I1007 13:08:17.963969 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:17Z","lastTransitionTime":"2025-10-07T13:08:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:17 crc kubenswrapper[4677]: I1007 13:08:17.977301 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9c35782-52f8-4fbc-9e52-07ee92002e3d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9182677b05c8d32f333b4e806b6dc29e0ce3f6171616ed303459ccb6a3754a4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://550a4491cebcbd8b3a62831cce07b13bb79051cd51505aab1f74bcfee692f7b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d90d7cc786a9f94c269a99be97c00685a2e10bde12e0afe4db2de40b95749a47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":
0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e99541a9f53339e760fb1074be18ebfcb8b225c64c290478559d2e3722ba9296\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73cec6b690e4d01d9d206a812f278832d622a7bdfb74ddcfb5904e19f721fae6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f16639187300b01b7c9f30dda26da7c1de9de9c66baf7ad716875eba41a5d7d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f16639187300b01b7c9f30dda26da7c1de9de9c66baf7ad716875eba41a5d7d6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dab4e59081dcacddedf345841dce8c940fae52aeee55d6c956e1364dd70872b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"termi
nated\\\":{\\\"containerID\\\":\\\"cri-o://dab4e59081dcacddedf345841dce8c940fae52aeee55d6c956e1364dd70872b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://abcfe5aa83c73ae24b7f0b414b47344969924ca50a5e1669bfb4704f1386cd17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abcfe5aa83c73ae24b7f0b414b47344969924ca50a5e1669bfb4704f1386cd17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:17Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:17 crc kubenswrapper[4677]: I1007 13:08:17.991336 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ad65790-6a90-4c21-b5c5-ac1ddf2cbe52\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34b15b8f71a10920a74f784c3440031e14726f661e10f628b269da08e70a7cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a51c6d4e82136b444754dc679f864558f74624af4ff94f794e473c92c8f6c87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96d7ddc445a61d5fd4959d1b3b4e2c93503111a12f461d945dd298a3f8540f65\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13a4e9dcaf4f4585c45625444ba093f84acc83f03e96235efd054bed4a38fc21\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f3139524d94a487c756b4a7e3660c0a4380bcba8aa8b588368530f43aab0b1d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T13:07:25Z\\\",\\\"message\\\":\\\"le observer\\\\nW1007 13:07:25.065953 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1007 13:07:25.066115 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 13:07:25.067558 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2107846005/tls.crt::/tmp/serving-cert-2107846005/tls.key\\\\\\\"\\\\nI1007 13:07:25.378394 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1007 13:07:25.383525 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1007 13:07:25.383565 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1007 13:07:25.383605 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1007 13:07:25.383617 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1007 13:07:25.390977 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1007 13:07:25.391014 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1007 13:07:25.391020 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 13:07:25.391038 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 13:07:25.391053 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1007 13:07:25.391061 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1007 13:07:25.391071 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1007 13:07:25.391088 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1007 13:07:25.394664 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b92962cb41f37a615f473651c01e37f5d53e01f3fb4b7c0eb2092095bb55239\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d927fbc5242eb951e8c2ec6d2859315f34e4b190bb43e646044111d1f583bf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d927fbc5242eb951e8c2ec6d2859315f34e4b190bb43e646044111d1f583bf0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:17Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:18 crc kubenswrapper[4677]: I1007 13:08:18.003647 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ef29067dc23263a94c4f861ada9ebbe04aae442de3da9fa34db521177f60ce8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:18Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:18 crc kubenswrapper[4677]: I1007 13:08:18.015471 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:18Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:18 crc kubenswrapper[4677]: I1007 13:08:18.024863 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e49f4b2f98de9e297e6a31a5583120192adf9a013700b49bb419e54d9e75fdbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:18Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:18 crc kubenswrapper[4677]: I1007 13:08:18.066241 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:18 crc kubenswrapper[4677]: I1007 13:08:18.066308 4677 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:18 crc kubenswrapper[4677]: I1007 13:08:18.066320 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:18 crc kubenswrapper[4677]: I1007 13:08:18.066336 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:18 crc kubenswrapper[4677]: I1007 13:08:18.066367 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:18Z","lastTransitionTime":"2025-10-07T13:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:18 crc kubenswrapper[4677]: I1007 13:08:18.168919 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:18 crc kubenswrapper[4677]: I1007 13:08:18.168961 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:18 crc kubenswrapper[4677]: I1007 13:08:18.168973 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:18 crc kubenswrapper[4677]: I1007 13:08:18.168989 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:18 crc kubenswrapper[4677]: I1007 13:08:18.169001 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:18Z","lastTransitionTime":"2025-10-07T13:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:18 crc kubenswrapper[4677]: I1007 13:08:18.271130 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:18 crc kubenswrapper[4677]: I1007 13:08:18.271188 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:18 crc kubenswrapper[4677]: I1007 13:08:18.271202 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:18 crc kubenswrapper[4677]: I1007 13:08:18.271219 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:18 crc kubenswrapper[4677]: I1007 13:08:18.271229 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:18Z","lastTransitionTime":"2025-10-07T13:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:18 crc kubenswrapper[4677]: I1007 13:08:18.302958 4677 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 13:08:18 crc kubenswrapper[4677]: E1007 13:08:18.303095 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 13:08:18 crc kubenswrapper[4677]: I1007 13:08:18.374095 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:18 crc kubenswrapper[4677]: I1007 13:08:18.374147 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:18 crc kubenswrapper[4677]: I1007 13:08:18.374169 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:18 crc kubenswrapper[4677]: I1007 13:08:18.374189 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:18 crc kubenswrapper[4677]: I1007 13:08:18.374200 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:18Z","lastTransitionTime":"2025-10-07T13:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:18 crc kubenswrapper[4677]: I1007 13:08:18.477192 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:18 crc kubenswrapper[4677]: I1007 13:08:18.477242 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:18 crc kubenswrapper[4677]: I1007 13:08:18.477253 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:18 crc kubenswrapper[4677]: I1007 13:08:18.477268 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:18 crc kubenswrapper[4677]: I1007 13:08:18.477277 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:18Z","lastTransitionTime":"2025-10-07T13:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:08:18 crc kubenswrapper[4677]: I1007 13:08:18.580014 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:18 crc kubenswrapper[4677]: I1007 13:08:18.580052 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:18 crc kubenswrapper[4677]: I1007 13:08:18.580060 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:18 crc kubenswrapper[4677]: I1007 13:08:18.580073 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:18 crc kubenswrapper[4677]: I1007 13:08:18.580082 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:18Z","lastTransitionTime":"2025-10-07T13:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:18 crc kubenswrapper[4677]: I1007 13:08:18.682585 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:18 crc kubenswrapper[4677]: I1007 13:08:18.682659 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:18 crc kubenswrapper[4677]: I1007 13:08:18.682684 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:18 crc kubenswrapper[4677]: I1007 13:08:18.682714 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:18 crc kubenswrapper[4677]: I1007 13:08:18.682737 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:18Z","lastTransitionTime":"2025-10-07T13:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:08:18 crc kubenswrapper[4677]: I1007 13:08:18.772833 4677 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-pjgpx_73bebfb3-50b5-48b6-b348-1d1feb6202d2/kube-multus/0.log" Oct 07 13:08:18 crc kubenswrapper[4677]: I1007 13:08:18.772925 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-pjgpx" event={"ID":"73bebfb3-50b5-48b6-b348-1d1feb6202d2","Type":"ContainerStarted","Data":"fcea2caf828321399fab99a6225cb39dd0c4aba8481cc040a10d86e90b6e4029"} Oct 07 13:08:18 crc kubenswrapper[4677]: I1007 13:08:18.785794 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:18 crc kubenswrapper[4677]: I1007 13:08:18.785854 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:18 crc kubenswrapper[4677]: I1007 13:08:18.785871 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:18 crc kubenswrapper[4677]: I1007 13:08:18.785895 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:18 crc kubenswrapper[4677]: I1007 13:08:18.785913 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:18Z","lastTransitionTime":"2025-10-07T13:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:18 crc kubenswrapper[4677]: I1007 13:08:18.788618 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:18Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:18 crc kubenswrapper[4677]: I1007 13:08:18.803965 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e49f4b2f98de9e297e6a31a5583120192adf9a013700b49bb419e54d9e75fdbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:18Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:18 crc kubenswrapper[4677]: I1007 13:08:18.821059 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c40c47d2-50a4-43f1-9b6e-08b60a3260c5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f36e52a7e88b59d8fd38c1fe659ce9b539e514c9d31e326a3ed647ebb8d19781\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://84b1c015461fecca9e5122abe950f33e24f4b7188568ea84cb059a08a4637963\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ecf81a2a9f147c0d9643f8e6c45248164053203ca4e5bbdc57c38e5803a5386\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e8dc3f8bdc52104efdb49a017d6497e2aaa3ed2b593794413fcd1acf2e06d36\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:18Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:18 crc kubenswrapper[4677]: I1007 13:08:18.833391 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:18Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:18 crc kubenswrapper[4677]: I1007 13:08:18.845039 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-r7cnz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7879fa59-a7cb-4d29-ba3a-c91f43bfcba6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5b2cfaaf4533573a7cdf927cb9a0b61690f4f04ca22f5da5013fd218ee2cba1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r59hd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d3e4ef8267212ad1faf24bfcb3b6f633a283684ba587e304e94d434bd9a2618\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r59hd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-r7cnz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:18Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:18 crc kubenswrapper[4677]: I1007 13:08:18.870119 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29c8j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3458826a-000d-407d-92c8-236d1a05842e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cf7d8cdd34bc883eae38c5e4690efd4e1c29cc633b5bbadc5de2b5b844a9da3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b77b2aafb3baf1c5b72d62156bd1c1bec76385637d5795166fe3d4f22a169503\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99b5fbb5ad3249aa5264c37bd635ed5f6283ec72c7eb071002cd7bddc12052f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eee7c253a1a514447553be977a3e534608ef6a1178664bf139ee84ec41180db0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f333db7aeb7d3cd308131992b4cd1284c1c56e27bbfd731404febc0efc953925\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ddf4e352b778815786f6fb204486a53d958310e53569f89a2895fe388a727da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d41f7a2621bf9c7b0e852cc4fb11dc29a2856c5b
6519628df5c77b064c310b1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d41f7a2621bf9c7b0e852cc4fb11dc29a2856c5b6519628df5c77b064c310b1b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T13:08:06Z\\\",\\\"message\\\":\\\"release.openshift.io/self-managed-high-availability:true include.release.openshift.io/single-node-developer:true service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-secret-name:serving-cert service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [{config.openshift.io/v1 ClusterVersion version 9101b518-476b-4eea-8fa6-69b0534e5caa 0xc00767ca9f \\\\u003cnil\\\\u003e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:https,Protocol:TCP,Port:443,TargetPort:{0 8443 },NodePort:0,AppProtocol:nil,},},Selector:map[string]string{app: service-ca-operator,},ClusterIP:10.217.4.40,Type:ClusterIP,ExternalIPs:[],SessionAffinity:None,LoadBalancerIP:,LoadBalancerSourceRanges:[],ExternalName:,ExternalTrafficPolicy:,HealthCheckNodePort:0,PublishNotReadyAddresses:false,SessionAffinityConfig:nil,IPFamilyPolicy:*SingleStack,ClusterIPs:[10.217.4.40],IPFamilies:[IPv4],AllocateLoadBalancerNodePorts:nil,LoadBalancerClass:nil,InternalTrafficPolicy:*Cluster,TrafficDistribution:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ingress:[]LoadBalancerIngress{},},Conditions:[]Condition{},},}\\\\nF1007 13:08:06.342882 6400 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:08:05Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-29c8j_openshift-ovn-kubernetes(3458826a-000d-407d-92c8-236d1a05842e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1f410624ff7e026c196d43d5ef830ce7b34981b703d5399a135dab0122640ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa229d0805576aa04db8e58e52e8c8bdc07663414dde42a94cac4fe628cfac7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa229d0805576aa04db8e58e52e8c8bdc07663414dde42a94cac4fe628cfac7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-29c8j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:18Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:18 crc kubenswrapper[4677]: I1007 13:08:18.885588 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pjgpx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"73bebfb3-50b5-48b6-b348-1d1feb6202d2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fcea2caf828321399fab99a6225cb39dd0c4aba8481cc040a10d86e90b6e4029\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b0ac92c71edc3d5107aece2d0e005a546cf25d79d696f4e330b7c0c8babc546\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T13:08:16Z\\\",\\\"message\\\":\\\"2025-10-07T13:07:31+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_d2e9730f-3112-4cf9-bde0-fbf3e73808c6\\\\n2025-10-07T13:07:31+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_d2e9730f-3112-4cf9-bde0-fbf3e73808c6 to 
/host/opt/cni/bin/\\\\n2025-10-07T13:07:31Z [verbose] multus-daemon started\\\\n2025-10-07T13:07:31Z [verbose] Readiness Indicator file check\\\\n2025-10-07T13:08:16Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:08:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h59cg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pjgpx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:18Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:18 crc kubenswrapper[4677]: I1007 13:08:18.888243 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:18 crc kubenswrapper[4677]: I1007 13:08:18.888278 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:18 crc kubenswrapper[4677]: I1007 13:08:18.888288 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:18 crc kubenswrapper[4677]: I1007 13:08:18.888304 4677 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeNotReady" Oct 07 13:08:18 crc kubenswrapper[4677]: I1007 13:08:18.888315 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:18Z","lastTransitionTime":"2025-10-07T13:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:18 crc kubenswrapper[4677]: I1007 13:08:18.898198 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qf2v9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea5a5436-29f6-4edd-9d4d-22eb9dd828c3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1a8e8a31adbc84ed02ff984941bb00da95740b19e8717fc6d4fb39b62338973\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ftmxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://97188126bcea5ad3844f74c9402e831926e1142944778240b4d4b26da7ea40c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-
api-access-ftmxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qf2v9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:18Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:18 crc kubenswrapper[4677]: I1007 13:08:18.908543 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8bljr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f63a77a6-7e4a-4ed0-a996-b8f80233d10c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rc97j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rc97j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:43Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8bljr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:18Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:18 crc kubenswrapper[4677]: I1007 13:08:18.921923 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c3400d7-6126-498b-ba93-b88903b8d698\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:08:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:08:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffff3685813b0115a56c61e90cb80318d0265429d9be16eeb9a4d0870ec2a442\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://10f7bfbc2ad9b8a554ee30118d74323aeddc334939ab97ab61cbd5eb24ae1db3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c50c0057805c0f200830303329e9c3c8c75c20246ace7131caff6afb6aca6f79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6cd7e95c3dad799fcc99041e53970f7f6be7e9f4280d724394e9a06051043706\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6cd7e95c3dad799fcc99041e53970f7f6be7e9f4280d724394e9a06051043706\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:10Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:09Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:18Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:18 crc kubenswrapper[4677]: I1007 13:08:18.938635 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf9347ca53bc58ad2e19bdbccd5eb40fde5ef36cdc0c2a2899e7e86977208446\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7932ee6d24ab75f34dabb17b5c2732dc1437e94b4fab6cace5c5bf4d8b4a8fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:18Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:18 crc kubenswrapper[4677]: I1007 13:08:18.949947 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c2h2k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6a7b491-6ed9-4906-8d2d-d8913a581b95\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://079f44a0676fd6e659268707658dfce76f5c80881ebd1b7f77b831a653002cbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gh4gv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c2h2k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired 
or is not yet valid: current time 2025-10-07T13:08:18Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:18 crc kubenswrapper[4677]: I1007 13:08:18.969418 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-czmsr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67f7f734-b59a-447c-b4a5-5aeb78d3a4dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://620c115cda692d86e1c655fe633ade8d56b4ad3faff70ec3383e0d6931e91acd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84e2aad54ae491e74845fbb681491bde96694b5f242d15a1b4c0f62b81048e7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84e2aad54ae491e74845fbb681491bde96694b5f242d15a1b4c0f62b81048e7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2dfd4b5d1804d9d
8e9ccf4dea8fb23cf60cd039e32d4bfc7d0b0e189da178b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2dfd4b5d1804d9d8e9ccf4dea8fb23cf60cd039e32d4bfc7d0b0e189da178b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c337ed2525a5d0aca41ff193138829ecd98172765110f97081d5a9b57ea39519\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c337ed2525a5d0aca41ff193138829ecd98172765110f97081d5a9b57ea39519\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b53d862bdcbdf9ebb58ca6717073ecf18236be981bf633bf22e3a077fa6ed90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b53d862bdcbdf9ebb58ca6717073ecf18236be981bf633bf22e3a077fa6ed90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoin
t\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3de85e8780fadb509355343e297ea73c2bd1047219be9911d445e044d40d378b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3de85e8780fadb509355343e297ea73c2bd1047219be9911d445e044d40d378b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6309cf358d3d85018e9e89fdb974146ab00944de09d3b38cfbf357df59b442f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6309cf358d3d85018e9e89fdb974146ab00944de09d3b38cfbf357df59b442f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-czmsr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:18Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:18 crc kubenswrapper[4677]: I1007 13:08:18.991016 4677 
status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9c35782-52f8-4fbc-9e52-07ee92002e3d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9182677b05c8d32f333b4e806b6dc29e0ce3f6171616ed303459ccb6a3754a4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://550a4491cebcbd8b3a62831cce07b13bb79051cd51505aab1f74bcfee692f7b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d90d7cc786a9f94c269a99be97c00685a2e10bde12e0afe4db2de40b95749a47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-
certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e99541a9f53339e760fb1074be18ebfcb8b225c64c290478559d2e3722ba9296\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73cec6b690e4d01d9d206a812f278832d622a7bdfb74ddcfb5904e19f721fae6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f16639187300b01b7c9f30dda26da7c1de9de9c66baf7ad716875eba41a5d7d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f16639187300b01b7c9f30dda26da7c1de9de9c66baf7ad716875eba41a5d7d6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dab4e59081dcacddedf345841dce8c940fae52aeee55d6c956e1364dd70872b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dab4e59081dcacddedf345841dce8c940fae52aeee55d6c956e1364dd70872b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:
07:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://abcfe5aa83c73ae24b7f0b414b47344969924ca50a5e1669bfb4704f1386cd17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abcfe5aa83c73ae24b7f0b414b47344969924ca50a5e1669bfb4704f1386cd17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:18Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:18 crc kubenswrapper[4677]: I1007 13:08:18.991262 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:18 crc kubenswrapper[4677]: I1007 13:08:18.991295 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:18 crc kubenswrapper[4677]: I1007 13:08:18.991306 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:18 crc kubenswrapper[4677]: I1007 13:08:18.991321 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:18 crc kubenswrapper[4677]: I1007 13:08:18.991331 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:18Z","lastTransitionTime":"2025-10-07T13:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:08:19 crc kubenswrapper[4677]: I1007 13:08:19.005887 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ad65790-6a90-4c21-b5c5-ac1ddf2cbe52\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34b15b8f71a10920a74f784c3440031e14726f661e10f628b269da08e70a7cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a51c6d4e82136b444754dc679f864558f74624af4ff94f794e473c92c8f6c87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96d7ddc445a61d5fd4959d1b3b4e2c93503111a12f461d945dd298a3f8540f65\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/ku
bernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13a4e9dcaf4f4585c45625444ba093f84acc83f03e96235efd054bed4a38fc21\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f3139524d94a487c756b4a7e3660c0a4380bcba8aa8b588368530f43aab0b1d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T13:07:25Z\\\",\\\"message\\\":\\\"le observer\\\\nW1007 13:07:25.065953 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1007 13:07:25.066115 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 13:07:25.067558 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2107846005/tls.crt::/tmp/serving-cert-2107846005/tls.key\\\\\\\"\\\\nI1007 13:07:25.378394 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1007 13:07:25.383525 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1007 13:07:25.383565 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1007 13:07:25.383605 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1007 13:07:25.383617 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1007 13:07:25.390977 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1007 13:07:25.391014 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1007 13:07:25.391020 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 13:07:25.391038 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 13:07:25.391053 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1007 13:07:25.391061 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1007 13:07:25.391071 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1007 13:07:25.391088 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1007 13:07:25.394664 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b92962cb41f37a615f473651c01e37f5d53e01f3fb4b7c0eb2092095bb55239\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d927fbc5242eb951e8c2ec6d2859315f34e4b190bb43e646044111d1f583bf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d927fbc5242eb951e8c2ec6d2859315f34e4b190bb43e646044111d1f583bf0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:19Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:19 crc kubenswrapper[4677]: I1007 13:08:19.017316 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ef29067dc23263a94c4f861ada9ebbe04aae442de3da9fa34db521177f60ce8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:19Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:19 crc kubenswrapper[4677]: I1007 13:08:19.028453 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:19Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:19 crc kubenswrapper[4677]: I1007 13:08:19.037314 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8xd94" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f78c6e9a-e5e3-4296-b2e6-3ba36d1808ff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2e7ebbc9f01ac7f853075c65c8cc57c691cf3f95e41036294486ad4a3bb807c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8xd94\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:19Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:19 crc kubenswrapper[4677]: I1007 13:08:19.093757 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:19 crc kubenswrapper[4677]: I1007 13:08:19.093809 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:19 crc kubenswrapper[4677]: I1007 13:08:19.093821 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:19 crc kubenswrapper[4677]: I1007 13:08:19.093836 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:19 crc kubenswrapper[4677]: I1007 13:08:19.093845 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:19Z","lastTransitionTime":"2025-10-07T13:08:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:19 crc kubenswrapper[4677]: I1007 13:08:19.196677 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:19 crc kubenswrapper[4677]: I1007 13:08:19.196730 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:19 crc kubenswrapper[4677]: I1007 13:08:19.196747 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:19 crc kubenswrapper[4677]: I1007 13:08:19.196769 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:19 crc kubenswrapper[4677]: I1007 13:08:19.196786 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:19Z","lastTransitionTime":"2025-10-07T13:08:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:08:19 crc kubenswrapper[4677]: I1007 13:08:19.300119 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:19 crc kubenswrapper[4677]: I1007 13:08:19.300176 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:19 crc kubenswrapper[4677]: I1007 13:08:19.300192 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:19 crc kubenswrapper[4677]: I1007 13:08:19.300215 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:19 crc kubenswrapper[4677]: I1007 13:08:19.300232 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:19Z","lastTransitionTime":"2025-10-07T13:08:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:19 crc kubenswrapper[4677]: I1007 13:08:19.302503 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 13:08:19 crc kubenswrapper[4677]: I1007 13:08:19.302555 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:08:19 crc kubenswrapper[4677]: E1007 13:08:19.302757 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 13:08:19 crc kubenswrapper[4677]: I1007 13:08:19.302774 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8bljr" Oct 07 13:08:19 crc kubenswrapper[4677]: E1007 13:08:19.302883 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 13:08:19 crc kubenswrapper[4677]: E1007 13:08:19.303040 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8bljr" podUID="f63a77a6-7e4a-4ed0-a996-b8f80233d10c" Oct 07 13:08:19 crc kubenswrapper[4677]: I1007 13:08:19.304108 4677 scope.go:117] "RemoveContainer" containerID="d41f7a2621bf9c7b0e852cc4fb11dc29a2856c5b6519628df5c77b064c310b1b" Oct 07 13:08:19 crc kubenswrapper[4677]: E1007 13:08:19.304414 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-29c8j_openshift-ovn-kubernetes(3458826a-000d-407d-92c8-236d1a05842e)\"" pod="openshift-ovn-kubernetes/ovnkube-node-29c8j" podUID="3458826a-000d-407d-92c8-236d1a05842e" Oct 07 13:08:19 crc kubenswrapper[4677]: I1007 13:08:19.316102 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:19Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:19 crc kubenswrapper[4677]: I1007 13:08:19.327173 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e49f4b2f98de9e297e6a31a5583120192adf9a013700b49bb419e54d9e75fdbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:19Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:19 crc kubenswrapper[4677]: I1007 13:08:19.340702 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qf2v9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea5a5436-29f6-4edd-9d4d-22eb9dd828c3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1a8e8a31adbc84ed02ff984941bb00da95740b19e8717fc6d4fb39b62338973\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ftmxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://97188126bcea5ad3844f74c9402e831926e1142944778240b4d4b26da7ea40c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ftmxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qf2v9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:19Z is after 2025-08-24T17:21:41Z" Oct 07 
13:08:19 crc kubenswrapper[4677]: I1007 13:08:19.353290 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8bljr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f63a77a6-7e4a-4ed0-a996-b8f80233d10c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rc97j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rc97j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:43Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8bljr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:19Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:19 crc kubenswrapper[4677]: I1007 13:08:19.369048 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c40c47d2-50a4-43f1-9b6e-08b60a3260c5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f36e52a7e88b59d8fd38c1fe659ce9b539e514c9d31e326a3ed647ebb8d19781\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://84b1c015461fecca9e5122abe950f33e24f4b7188568ea84cb059a08a4637963\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ecf81a2a9f147c0d9643f8e6c45248164053203ca4e5bbdc57c38e5803a5386\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e8dc3f8bdc52104efdb49a017d6497e2aaa3ed2b593794413fcd1acf2e06d36\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:19Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:19 crc kubenswrapper[4677]: I1007 13:08:19.383358 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:19Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:19 crc kubenswrapper[4677]: I1007 13:08:19.396789 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-r7cnz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7879fa59-a7cb-4d29-ba3a-c91f43bfcba6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5b2cfaaf4533573a7cdf927cb9a0b61690f4f04ca22f5da5013fd218ee2cba1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r59hd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d3e4ef8267212ad1faf24bfcb3b6f633a283684ba587e304e94d434bd9a2618\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r59hd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-r7cnz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:19Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:19 crc kubenswrapper[4677]: I1007 13:08:19.406078 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:19 crc kubenswrapper[4677]: I1007 13:08:19.406115 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:19 crc kubenswrapper[4677]: I1007 13:08:19.406124 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:19 crc kubenswrapper[4677]: I1007 13:08:19.406139 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:19 crc kubenswrapper[4677]: I1007 13:08:19.406148 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:19Z","lastTransitionTime":"2025-10-07T13:08:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:08:19 crc kubenswrapper[4677]: I1007 13:08:19.424619 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29c8j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3458826a-000d-407d-92c8-236d1a05842e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cf7d8cdd34bc883eae38c5e4690efd4e1c29cc633b5bbadc5de2b5b844a9da3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b77b2aafb3baf1c5b72d62156bd1c1bec76385637d5795166fe3d4f22a169503\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\
":\\\"cri-o://99b5fbb5ad3249aa5264c37bd635ed5f6283ec72c7eb071002cd7bddc12052f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eee7c253a1a514447553be977a3e534608ef6a1178664bf139ee84ec41180db0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f333db7aeb7d3cd308131992b4cd1284c1c56e27bbfd731404febc0efc953925\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ddf4e352b778815786f6fb204486a53d958310e53569f89a2895fe388a727da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d41f7a2621bf9c7b0e852cc4fb11dc29a2856c5b6519628df5c77b064c310b1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d41f7a2621bf9c7b0e852cc4fb11dc29a2856c5b6519628df5c77b064c310b1b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T13:08:06Z\\\",\\\"message\\\":\\\"release.openshift.io/self-managed-high-availability:true include.release.openshift.io/single-node-developer:true service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-secret-name:serving-cert service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [{config.openshift.io/v1 ClusterVersion version 9101b518-476b-4eea-8fa6-69b0534e5caa 0xc00767ca9f \\\\u003cnil\\\\u003e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:https,Protocol:TCP,Port:443,TargetPort:{0 8443 },NodePort:0,AppProtocol:nil,},},Selector:map[string]string{app: service-ca-operator,},ClusterIP:10.217.4.40,Type:ClusterIP,ExternalIPs:[],SessionAffinity:None,LoadBalancerIP:,LoadBalancerSourceRanges:[],ExternalName:,ExternalTrafficPolicy:,HealthCheckNodePort:0,PublishNotReadyAddresses:false,SessionAffinityConfig:nil,IPFamilyPolicy:*SingleStack,ClusterIPs:[10.217.4.40],IPFamilies:[IPv4],AllocateLoadBalancerNodePorts:nil,LoadBalancerClass:nil,InternalTrafficPolicy:*Cluster,TrafficDistribution:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ingress:[]LoadBalancerIngress{},},Conditions:[]Condition{},},}\\\\nF1007 13:08:06.342882 6400 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:08:05Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-29c8j_openshift-ovn-kubernetes(3458826a-000d-407d-92c8-236d1a05842e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1f410624ff7e026c196d43d5ef830ce7b34981b703d5399a135dab0122640ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa229d0805576aa04db8e58e52e8c8bdc07663414dde42a94cac4fe628cfac7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa229d0805576aa04db8e58e52e8c8bdc07663414dde42a94cac4fe628cfac7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-29c8j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:19Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:19 crc kubenswrapper[4677]: I1007 13:08:19.437155 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pjgpx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"73bebfb3-50b5-48b6-b348-1d1feb6202d2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fcea2caf828321399fab99a6225cb39dd0c4aba8481cc040a10d86e90b6e4029\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b0ac92c71edc3d5107aece2d0e005a546cf25d79d696f4e330b7c0c8babc546\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T13:08:16Z\\\",\\\"message\\\":\\\"2025-10-07T13:07:31+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_d2e9730f-3112-4cf9-bde0-fbf3e73808c6\\\\n2025-10-07T13:07:31+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_d2e9730f-3112-4cf9-bde0-fbf3e73808c6 to 
/host/opt/cni/bin/\\\\n2025-10-07T13:07:31Z [verbose] multus-daemon started\\\\n2025-10-07T13:07:31Z [verbose] Readiness Indicator file check\\\\n2025-10-07T13:08:16Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:08:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h59cg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pjgpx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:19Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:19 crc kubenswrapper[4677]: I1007 13:08:19.450887 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c3400d7-6126-498b-ba93-b88903b8d698\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:08:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:08:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffff3685813b0115a56c61e90cb80318d0265429d9be16eeb9a4d0870ec2a442\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://10f7bfbc2ad9b8a554ee30118d74323aeddc334939ab97ab61cbd5eb24ae1db3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c50c0057805c0f200830303329e9c3c8c75c20246ace7131caff6afb6aca6f79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6cd7e95c3dad799fcc99041e53970f7f6be7e9f4280d724394e9a06051043706\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6cd7e95c3dad799fcc99041e53970f7f6be7e9f4280d724394e9a06051043706\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:10Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:09Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:19Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:19 crc kubenswrapper[4677]: I1007 13:08:19.464721 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf9347ca53bc58ad2e19bdbccd5eb40fde5ef36cdc0c2a2899e7e86977208446\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7932ee6d24ab75f34dabb17b5c2732dc1437e94b4fab6cace5c5bf4d8b4a8fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:19Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:19 crc kubenswrapper[4677]: I1007 13:08:19.478384 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c2h2k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6a7b491-6ed9-4906-8d2d-d8913a581b95\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://079f44a0676fd6e659268707658dfce76f5c80881ebd1b7f77b831a653002cbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gh4gv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c2h2k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired 
or is not yet valid: current time 2025-10-07T13:08:19Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:19 crc kubenswrapper[4677]: I1007 13:08:19.492487 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-czmsr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67f7f734-b59a-447c-b4a5-5aeb78d3a4dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://620c115cda692d86e1c655fe633ade8d56b4ad3faff70ec3383e0d6931e91acd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84e2aad54ae491e74845fbb681491bde96694b5f242d15a1b4c0f62b81048e7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84e2aad54ae491e74845fbb681491bde96694b5f242d15a1b4c0f62b81048e7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2dfd4b5d1804d9d
8e9ccf4dea8fb23cf60cd039e32d4bfc7d0b0e189da178b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2dfd4b5d1804d9d8e9ccf4dea8fb23cf60cd039e32d4bfc7d0b0e189da178b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c337ed2525a5d0aca41ff193138829ecd98172765110f97081d5a9b57ea39519\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c337ed2525a5d0aca41ff193138829ecd98172765110f97081d5a9b57ea39519\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b53d862bdcbdf9ebb58ca6717073ecf18236be981bf633bf22e3a077fa6ed90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b53d862bdcbdf9ebb58ca6717073ecf18236be981bf633bf22e3a077fa6ed90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoin
t\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3de85e8780fadb509355343e297ea73c2bd1047219be9911d445e044d40d378b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3de85e8780fadb509355343e297ea73c2bd1047219be9911d445e044d40d378b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6309cf358d3d85018e9e89fdb974146ab00944de09d3b38cfbf357df59b442f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6309cf358d3d85018e9e89fdb974146ab00944de09d3b38cfbf357df59b442f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-czmsr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:19Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:19 crc kubenswrapper[4677]: I1007 13:08:19.504675 4677 
status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8xd94" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f78c6e9a-e5e3-4296-b2e6-3ba36d1808ff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2e7ebbc9f01ac7f853075c65c8cc57c691cf3f95e41036294486ad4a3bb807c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8xd94\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:19Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:19 crc kubenswrapper[4677]: I1007 13:08:19.507721 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:19 crc kubenswrapper[4677]: I1007 13:08:19.507761 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:19 crc kubenswrapper[4677]: I1007 13:08:19.507775 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:19 crc kubenswrapper[4677]: I1007 13:08:19.507795 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:19 crc kubenswrapper[4677]: I1007 13:08:19.507805 4677 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:19Z","lastTransitionTime":"2025-10-07T13:08:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:19 crc kubenswrapper[4677]: I1007 13:08:19.525395 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9c35782-52f8-4fbc-9e52-07ee92002e3d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9182677b05c8d32f333b4e806b6dc29e0ce3f6171616ed303459ccb6a3754a4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://550a4491cebcbd8b3a62831cce07b13bb79051cd51505aab1f74bcfee692f7b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d90d7cc786a9f94c269a99be97c00685a2e10bde12e0afe4db2de40b95749a47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/open
shift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e99541a9f53339e760fb1074be18ebfcb8b225c64c290478559d2e3722ba9296\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73cec6b690e4d01d9d206a812f278832d622a7bdfb74ddcfb5904e19f721fae6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f16639187300b01b7c9f30dda26da7c1de9de9c66baf7ad716875eba41a5d7d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f16639187300b01b7c9f30dda26da7c1de9de9c66baf7ad716875eba41a5d7d6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dab4e59081dcacddedf345841dce8c940fae52aeee55d6c956e1364dd70872b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd
6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dab4e59081dcacddedf345841dce8c940fae52aeee55d6c956e1364dd70872b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://abcfe5aa83c73ae24b7f0b414b47344969924ca50a5e1669bfb4704f1386cd17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abcfe5aa83c73ae24b7f0b414b47344969924ca50a5e1669bfb4704f1386cd17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:19Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:19 crc kubenswrapper[4677]: I1007 13:08:19.545541 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ad65790-6a90-4c21-b5c5-ac1ddf2cbe52\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34b15b8f71a10920a74f784c3440031e14726f661e10f628b269da08e70a7cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a51c6d4e82136b444754dc679f864558f74624af4ff94f794e473c92c8f6c87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96d7ddc445a61d5fd4959d1b3b4e2c93503111a12f461d945dd298a3f8540f65\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13a4e9dcaf4f4585c45625444ba093f84acc83f03e96235efd054bed4a38fc21\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f3139524d94a487c756b4a7e3660c0a4380bcba8aa8b588368530f43aab0b1d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T13:07:25Z\\\",\\\"message\\\":\\\"le observer\\\\nW1007 13:07:25.065953 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1007 13:07:25.066115 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 13:07:25.067558 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2107846005/tls.crt::/tmp/serving-cert-2107846005/tls.key\\\\\\\"\\\\nI1007 13:07:25.378394 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1007 13:07:25.383525 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1007 13:07:25.383565 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1007 13:07:25.383605 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1007 13:07:25.383617 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1007 13:07:25.390977 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1007 13:07:25.391014 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1007 13:07:25.391020 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 13:07:25.391038 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 13:07:25.391053 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1007 13:07:25.391061 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1007 13:07:25.391071 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1007 13:07:25.391088 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1007 13:07:25.394664 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b92962cb41f37a615f473651c01e37f5d53e01f3fb4b7c0eb2092095bb55239\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d927fbc5242eb951e8c2ec6d2859315f34e4b190bb43e646044111d1f583bf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d927fbc5242eb951e8c2ec6d2859315f34e4b190bb43e646044111d1f583bf0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:19Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:19 crc kubenswrapper[4677]: I1007 13:08:19.561346 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ef29067dc23263a94c4f861ada9ebbe04aae442de3da9fa34db521177f60ce8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:19Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:19 crc kubenswrapper[4677]: I1007 13:08:19.576474 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:19Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:19 crc kubenswrapper[4677]: I1007 13:08:19.610591 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:19 crc kubenswrapper[4677]: I1007 13:08:19.610642 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:19 crc kubenswrapper[4677]: I1007 13:08:19.610653 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:19 crc kubenswrapper[4677]: I1007 13:08:19.610669 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:19 crc kubenswrapper[4677]: I1007 13:08:19.610680 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:19Z","lastTransitionTime":"2025-10-07T13:08:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:19 crc kubenswrapper[4677]: I1007 13:08:19.713402 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:19 crc kubenswrapper[4677]: I1007 13:08:19.713484 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:19 crc kubenswrapper[4677]: I1007 13:08:19.713502 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:19 crc kubenswrapper[4677]: I1007 13:08:19.713548 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:19 crc kubenswrapper[4677]: I1007 13:08:19.713566 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:19Z","lastTransitionTime":"2025-10-07T13:08:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:08:19 crc kubenswrapper[4677]: I1007 13:08:19.815297 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:19 crc kubenswrapper[4677]: I1007 13:08:19.815343 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:19 crc kubenswrapper[4677]: I1007 13:08:19.815356 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:19 crc kubenswrapper[4677]: I1007 13:08:19.815372 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:19 crc kubenswrapper[4677]: I1007 13:08:19.815382 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:19Z","lastTransitionTime":"2025-10-07T13:08:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:19 crc kubenswrapper[4677]: I1007 13:08:19.917925 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:19 crc kubenswrapper[4677]: I1007 13:08:19.917972 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:19 crc kubenswrapper[4677]: I1007 13:08:19.917986 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:19 crc kubenswrapper[4677]: I1007 13:08:19.918007 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:19 crc kubenswrapper[4677]: I1007 13:08:19.918020 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:19Z","lastTransitionTime":"2025-10-07T13:08:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:20 crc kubenswrapper[4677]: I1007 13:08:20.020360 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:20 crc kubenswrapper[4677]: I1007 13:08:20.020409 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:20 crc kubenswrapper[4677]: I1007 13:08:20.020422 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:20 crc kubenswrapper[4677]: I1007 13:08:20.020471 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:20 crc kubenswrapper[4677]: I1007 13:08:20.020489 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:20Z","lastTransitionTime":"2025-10-07T13:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:08:20 crc kubenswrapper[4677]: I1007 13:08:20.123014 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:20 crc kubenswrapper[4677]: I1007 13:08:20.123090 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:20 crc kubenswrapper[4677]: I1007 13:08:20.123113 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:20 crc kubenswrapper[4677]: I1007 13:08:20.123144 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:20 crc kubenswrapper[4677]: I1007 13:08:20.123165 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:20Z","lastTransitionTime":"2025-10-07T13:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:20 crc kubenswrapper[4677]: I1007 13:08:20.226342 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:20 crc kubenswrapper[4677]: I1007 13:08:20.226403 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:20 crc kubenswrapper[4677]: I1007 13:08:20.226421 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:20 crc kubenswrapper[4677]: I1007 13:08:20.226492 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:20 crc kubenswrapper[4677]: I1007 13:08:20.226526 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:20Z","lastTransitionTime":"2025-10-07T13:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:20 crc kubenswrapper[4677]: I1007 13:08:20.302675 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 13:08:20 crc kubenswrapper[4677]: E1007 13:08:20.302889 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 13:08:20 crc kubenswrapper[4677]: I1007 13:08:20.328279 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:20 crc kubenswrapper[4677]: I1007 13:08:20.328319 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:20 crc kubenswrapper[4677]: I1007 13:08:20.328328 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:20 crc kubenswrapper[4677]: I1007 13:08:20.328344 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:20 crc kubenswrapper[4677]: I1007 13:08:20.328363 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:20Z","lastTransitionTime":"2025-10-07T13:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:20 crc kubenswrapper[4677]: I1007 13:08:20.430309 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:20 crc kubenswrapper[4677]: I1007 13:08:20.430346 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:20 crc kubenswrapper[4677]: I1007 13:08:20.430354 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:20 crc kubenswrapper[4677]: I1007 13:08:20.430369 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:20 crc kubenswrapper[4677]: I1007 13:08:20.430379 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:20Z","lastTransitionTime":"2025-10-07T13:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:08:20 crc kubenswrapper[4677]: I1007 13:08:20.533163 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:20 crc kubenswrapper[4677]: I1007 13:08:20.533223 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:20 crc kubenswrapper[4677]: I1007 13:08:20.533231 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:20 crc kubenswrapper[4677]: I1007 13:08:20.533599 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:20 crc kubenswrapper[4677]: I1007 13:08:20.533627 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:20Z","lastTransitionTime":"2025-10-07T13:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:20 crc kubenswrapper[4677]: I1007 13:08:20.636060 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:20 crc kubenswrapper[4677]: I1007 13:08:20.636111 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:20 crc kubenswrapper[4677]: I1007 13:08:20.636122 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:20 crc kubenswrapper[4677]: I1007 13:08:20.636137 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:20 crc kubenswrapper[4677]: I1007 13:08:20.636147 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:20Z","lastTransitionTime":"2025-10-07T13:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:20 crc kubenswrapper[4677]: I1007 13:08:20.738525 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:20 crc kubenswrapper[4677]: I1007 13:08:20.738569 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:20 crc kubenswrapper[4677]: I1007 13:08:20.738578 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:20 crc kubenswrapper[4677]: I1007 13:08:20.738590 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:20 crc kubenswrapper[4677]: I1007 13:08:20.738599 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:20Z","lastTransitionTime":"2025-10-07T13:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:08:20 crc kubenswrapper[4677]: I1007 13:08:20.841270 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:20 crc kubenswrapper[4677]: I1007 13:08:20.841515 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:20 crc kubenswrapper[4677]: I1007 13:08:20.841561 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:20 crc kubenswrapper[4677]: I1007 13:08:20.841592 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:20 crc kubenswrapper[4677]: I1007 13:08:20.841616 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:20Z","lastTransitionTime":"2025-10-07T13:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:20 crc kubenswrapper[4677]: I1007 13:08:20.943731 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:20 crc kubenswrapper[4677]: I1007 13:08:20.943762 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:20 crc kubenswrapper[4677]: I1007 13:08:20.943775 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:20 crc kubenswrapper[4677]: I1007 13:08:20.943813 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:20 crc kubenswrapper[4677]: I1007 13:08:20.943825 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:20Z","lastTransitionTime":"2025-10-07T13:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:21 crc kubenswrapper[4677]: I1007 13:08:21.046095 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:21 crc kubenswrapper[4677]: I1007 13:08:21.046142 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:21 crc kubenswrapper[4677]: I1007 13:08:21.046155 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:21 crc kubenswrapper[4677]: I1007 13:08:21.046171 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:21 crc kubenswrapper[4677]: I1007 13:08:21.046182 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:21Z","lastTransitionTime":"2025-10-07T13:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:08:21 crc kubenswrapper[4677]: I1007 13:08:21.148797 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:21 crc kubenswrapper[4677]: I1007 13:08:21.148867 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:21 crc kubenswrapper[4677]: I1007 13:08:21.148891 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:21 crc kubenswrapper[4677]: I1007 13:08:21.148919 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:21 crc kubenswrapper[4677]: I1007 13:08:21.148940 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:21Z","lastTransitionTime":"2025-10-07T13:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:21 crc kubenswrapper[4677]: I1007 13:08:21.251597 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:21 crc kubenswrapper[4677]: I1007 13:08:21.251665 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:21 crc kubenswrapper[4677]: I1007 13:08:21.251691 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:21 crc kubenswrapper[4677]: I1007 13:08:21.251721 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:21 crc kubenswrapper[4677]: I1007 13:08:21.251742 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:21Z","lastTransitionTime":"2025-10-07T13:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:21 crc kubenswrapper[4677]: I1007 13:08:21.302322 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:08:21 crc kubenswrapper[4677]: E1007 13:08:21.302575 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 13:08:21 crc kubenswrapper[4677]: I1007 13:08:21.302638 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 13:08:21 crc kubenswrapper[4677]: I1007 13:08:21.302674 4677 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-8bljr" Oct 07 13:08:21 crc kubenswrapper[4677]: E1007 13:08:21.303367 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8bljr" podUID="f63a77a6-7e4a-4ed0-a996-b8f80233d10c" Oct 07 13:08:21 crc kubenswrapper[4677]: E1007 13:08:21.303601 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 13:08:21 crc kubenswrapper[4677]: I1007 13:08:21.315414 4677 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Oct 07 13:08:21 crc kubenswrapper[4677]: I1007 13:08:21.354411 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:21 crc kubenswrapper[4677]: I1007 13:08:21.354479 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:21 crc kubenswrapper[4677]: I1007 13:08:21.354493 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:21 crc kubenswrapper[4677]: I1007 13:08:21.354507 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:21 crc kubenswrapper[4677]: I1007 13:08:21.354518 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:21Z","lastTransitionTime":"2025-10-07T13:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:21 crc kubenswrapper[4677]: I1007 13:08:21.456766 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:21 crc kubenswrapper[4677]: I1007 13:08:21.456812 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:21 crc kubenswrapper[4677]: I1007 13:08:21.456821 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:21 crc kubenswrapper[4677]: I1007 13:08:21.456840 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:21 crc kubenswrapper[4677]: I1007 13:08:21.456849 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:21Z","lastTransitionTime":"2025-10-07T13:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:08:21 crc kubenswrapper[4677]: I1007 13:08:21.560015 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:21 crc kubenswrapper[4677]: I1007 13:08:21.560064 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:21 crc kubenswrapper[4677]: I1007 13:08:21.560074 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:21 crc kubenswrapper[4677]: I1007 13:08:21.560089 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:21 crc kubenswrapper[4677]: I1007 13:08:21.560098 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:21Z","lastTransitionTime":"2025-10-07T13:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:21 crc kubenswrapper[4677]: I1007 13:08:21.662936 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:21 crc kubenswrapper[4677]: I1007 13:08:21.662980 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:21 crc kubenswrapper[4677]: I1007 13:08:21.662990 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:21 crc kubenswrapper[4677]: I1007 13:08:21.663004 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:21 crc kubenswrapper[4677]: I1007 13:08:21.663014 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:21Z","lastTransitionTime":"2025-10-07T13:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:21 crc kubenswrapper[4677]: I1007 13:08:21.767678 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:21 crc kubenswrapper[4677]: I1007 13:08:21.767736 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:21 crc kubenswrapper[4677]: I1007 13:08:21.767753 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:21 crc kubenswrapper[4677]: I1007 13:08:21.767776 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:21 crc kubenswrapper[4677]: I1007 13:08:21.767792 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:21Z","lastTransitionTime":"2025-10-07T13:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:08:21 crc kubenswrapper[4677]: I1007 13:08:21.871056 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:21 crc kubenswrapper[4677]: I1007 13:08:21.871119 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:21 crc kubenswrapper[4677]: I1007 13:08:21.871137 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:21 crc kubenswrapper[4677]: I1007 13:08:21.871161 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:21 crc kubenswrapper[4677]: I1007 13:08:21.871181 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:21Z","lastTransitionTime":"2025-10-07T13:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:21 crc kubenswrapper[4677]: I1007 13:08:21.973940 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:21 crc kubenswrapper[4677]: I1007 13:08:21.973989 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:21 crc kubenswrapper[4677]: I1007 13:08:21.974004 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:21 crc kubenswrapper[4677]: I1007 13:08:21.974022 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:21 crc kubenswrapper[4677]: I1007 13:08:21.974034 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:21Z","lastTransitionTime":"2025-10-07T13:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:22 crc kubenswrapper[4677]: I1007 13:08:22.076606 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:22 crc kubenswrapper[4677]: I1007 13:08:22.076656 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:22 crc kubenswrapper[4677]: I1007 13:08:22.076675 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:22 crc kubenswrapper[4677]: I1007 13:08:22.076700 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:22 crc kubenswrapper[4677]: I1007 13:08:22.076724 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:22Z","lastTransitionTime":"2025-10-07T13:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:08:22 crc kubenswrapper[4677]: I1007 13:08:22.178351 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:22 crc kubenswrapper[4677]: I1007 13:08:22.178396 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:22 crc kubenswrapper[4677]: I1007 13:08:22.178408 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:22 crc kubenswrapper[4677]: I1007 13:08:22.178424 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:22 crc kubenswrapper[4677]: I1007 13:08:22.178459 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:22Z","lastTransitionTime":"2025-10-07T13:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:22 crc kubenswrapper[4677]: I1007 13:08:22.280743 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:22 crc kubenswrapper[4677]: I1007 13:08:22.280779 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:22 crc kubenswrapper[4677]: I1007 13:08:22.280788 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:22 crc kubenswrapper[4677]: I1007 13:08:22.280801 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:22 crc kubenswrapper[4677]: I1007 13:08:22.280810 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:22Z","lastTransitionTime":"2025-10-07T13:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:22 crc kubenswrapper[4677]: I1007 13:08:22.302046 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 13:08:22 crc kubenswrapper[4677]: E1007 13:08:22.302166 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 13:08:22 crc kubenswrapper[4677]: I1007 13:08:22.383342 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:22 crc kubenswrapper[4677]: I1007 13:08:22.383379 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:22 crc kubenswrapper[4677]: I1007 13:08:22.383388 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:22 crc kubenswrapper[4677]: I1007 13:08:22.383401 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:22 crc kubenswrapper[4677]: I1007 13:08:22.383411 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:22Z","lastTransitionTime":"2025-10-07T13:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:22 crc kubenswrapper[4677]: I1007 13:08:22.486038 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:22 crc kubenswrapper[4677]: I1007 13:08:22.486101 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:22 crc kubenswrapper[4677]: I1007 13:08:22.486119 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:22 crc kubenswrapper[4677]: I1007 13:08:22.486144 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:22 crc kubenswrapper[4677]: I1007 13:08:22.486163 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:22Z","lastTransitionTime":"2025-10-07T13:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:08:22 crc kubenswrapper[4677]: I1007 13:08:22.587876 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:22 crc kubenswrapper[4677]: I1007 13:08:22.587951 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:22 crc kubenswrapper[4677]: I1007 13:08:22.587964 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:22 crc kubenswrapper[4677]: I1007 13:08:22.587981 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:22 crc kubenswrapper[4677]: I1007 13:08:22.587994 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:22Z","lastTransitionTime":"2025-10-07T13:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:22 crc kubenswrapper[4677]: I1007 13:08:22.690315 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:22 crc kubenswrapper[4677]: I1007 13:08:22.690350 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:22 crc kubenswrapper[4677]: I1007 13:08:22.690358 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:22 crc kubenswrapper[4677]: I1007 13:08:22.690374 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:22 crc kubenswrapper[4677]: I1007 13:08:22.690384 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:22Z","lastTransitionTime":"2025-10-07T13:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:22 crc kubenswrapper[4677]: I1007 13:08:22.792500 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:22 crc kubenswrapper[4677]: I1007 13:08:22.792569 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:22 crc kubenswrapper[4677]: I1007 13:08:22.792589 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:22 crc kubenswrapper[4677]: I1007 13:08:22.792613 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:22 crc kubenswrapper[4677]: I1007 13:08:22.792632 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:22Z","lastTransitionTime":"2025-10-07T13:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:08:22 crc kubenswrapper[4677]: I1007 13:08:22.900400 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:22 crc kubenswrapper[4677]: I1007 13:08:22.900502 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:22 crc kubenswrapper[4677]: I1007 13:08:22.900531 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:22 crc kubenswrapper[4677]: I1007 13:08:22.900558 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:22 crc kubenswrapper[4677]: I1007 13:08:22.900576 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:22Z","lastTransitionTime":"2025-10-07T13:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:23 crc kubenswrapper[4677]: I1007 13:08:23.003652 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:23 crc kubenswrapper[4677]: I1007 13:08:23.003705 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:23 crc kubenswrapper[4677]: I1007 13:08:23.003717 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:23 crc kubenswrapper[4677]: I1007 13:08:23.003735 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:23 crc kubenswrapper[4677]: I1007 13:08:23.003748 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:23Z","lastTransitionTime":"2025-10-07T13:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:23 crc kubenswrapper[4677]: I1007 13:08:23.107146 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:23 crc kubenswrapper[4677]: I1007 13:08:23.107207 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:23 crc kubenswrapper[4677]: I1007 13:08:23.107221 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:23 crc kubenswrapper[4677]: I1007 13:08:23.107237 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:23 crc kubenswrapper[4677]: I1007 13:08:23.107248 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:23Z","lastTransitionTime":"2025-10-07T13:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:08:23 crc kubenswrapper[4677]: I1007 13:08:23.209507 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:23 crc kubenswrapper[4677]: I1007 13:08:23.209569 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:23 crc kubenswrapper[4677]: I1007 13:08:23.209581 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:23 crc kubenswrapper[4677]: I1007 13:08:23.209599 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:23 crc kubenswrapper[4677]: I1007 13:08:23.209612 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:23Z","lastTransitionTime":"2025-10-07T13:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:23 crc kubenswrapper[4677]: I1007 13:08:23.302496 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8bljr" Oct 07 13:08:23 crc kubenswrapper[4677]: I1007 13:08:23.302583 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 13:08:23 crc kubenswrapper[4677]: E1007 13:08:23.302706 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8bljr" podUID="f63a77a6-7e4a-4ed0-a996-b8f80233d10c" Oct 07 13:08:23 crc kubenswrapper[4677]: I1007 13:08:23.302810 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:08:23 crc kubenswrapper[4677]: E1007 13:08:23.302840 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 13:08:23 crc kubenswrapper[4677]: E1007 13:08:23.303001 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 13:08:23 crc kubenswrapper[4677]: I1007 13:08:23.311781 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:23 crc kubenswrapper[4677]: I1007 13:08:23.311837 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:23 crc kubenswrapper[4677]: I1007 13:08:23.311856 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:23 crc kubenswrapper[4677]: I1007 13:08:23.311878 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:23 crc kubenswrapper[4677]: I1007 13:08:23.311897 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:23Z","lastTransitionTime":"2025-10-07T13:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:23 crc kubenswrapper[4677]: I1007 13:08:23.414839 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:23 crc kubenswrapper[4677]: I1007 13:08:23.414890 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:23 crc kubenswrapper[4677]: I1007 13:08:23.414902 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:23 crc kubenswrapper[4677]: I1007 13:08:23.414920 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:23 crc kubenswrapper[4677]: I1007 13:08:23.414933 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:23Z","lastTransitionTime":"2025-10-07T13:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:08:23 crc kubenswrapper[4677]: I1007 13:08:23.517875 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:23 crc kubenswrapper[4677]: I1007 13:08:23.517944 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:23 crc kubenswrapper[4677]: I1007 13:08:23.517968 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:23 crc kubenswrapper[4677]: I1007 13:08:23.518000 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:23 crc kubenswrapper[4677]: I1007 13:08:23.518027 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:23Z","lastTransitionTime":"2025-10-07T13:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:23 crc kubenswrapper[4677]: I1007 13:08:23.619685 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:23 crc kubenswrapper[4677]: I1007 13:08:23.619717 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:23 crc kubenswrapper[4677]: I1007 13:08:23.619726 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:23 crc kubenswrapper[4677]: I1007 13:08:23.619739 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:23 crc kubenswrapper[4677]: I1007 13:08:23.619748 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:23Z","lastTransitionTime":"2025-10-07T13:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:23 crc kubenswrapper[4677]: I1007 13:08:23.722310 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:23 crc kubenswrapper[4677]: I1007 13:08:23.722388 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:23 crc kubenswrapper[4677]: I1007 13:08:23.722414 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:23 crc kubenswrapper[4677]: I1007 13:08:23.722478 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:23 crc kubenswrapper[4677]: I1007 13:08:23.722502 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:23Z","lastTransitionTime":"2025-10-07T13:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:08:23 crc kubenswrapper[4677]: I1007 13:08:23.825604 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:23 crc kubenswrapper[4677]: I1007 13:08:23.825685 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:23 crc kubenswrapper[4677]: I1007 13:08:23.825711 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:23 crc kubenswrapper[4677]: I1007 13:08:23.825744 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:23 crc kubenswrapper[4677]: I1007 13:08:23.825768 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:23Z","lastTransitionTime":"2025-10-07T13:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:23 crc kubenswrapper[4677]: I1007 13:08:23.928666 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:23 crc kubenswrapper[4677]: I1007 13:08:23.928725 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:23 crc kubenswrapper[4677]: I1007 13:08:23.928744 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:23 crc kubenswrapper[4677]: I1007 13:08:23.928768 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:23 crc kubenswrapper[4677]: I1007 13:08:23.928786 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:23Z","lastTransitionTime":"2025-10-07T13:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:24 crc kubenswrapper[4677]: I1007 13:08:24.031347 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:24 crc kubenswrapper[4677]: I1007 13:08:24.031393 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:24 crc kubenswrapper[4677]: I1007 13:08:24.031406 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:24 crc kubenswrapper[4677]: I1007 13:08:24.031424 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:24 crc kubenswrapper[4677]: I1007 13:08:24.031462 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:24Z","lastTransitionTime":"2025-10-07T13:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:08:24 crc kubenswrapper[4677]: I1007 13:08:24.133627 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:24 crc kubenswrapper[4677]: I1007 13:08:24.133690 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:24 crc kubenswrapper[4677]: I1007 13:08:24.133709 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:24 crc kubenswrapper[4677]: I1007 13:08:24.133737 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:24 crc kubenswrapper[4677]: I1007 13:08:24.133755 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:24Z","lastTransitionTime":"2025-10-07T13:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:24 crc kubenswrapper[4677]: I1007 13:08:24.237504 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:24 crc kubenswrapper[4677]: I1007 13:08:24.237844 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:24 crc kubenswrapper[4677]: I1007 13:08:24.237869 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:24 crc kubenswrapper[4677]: I1007 13:08:24.237894 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:24 crc kubenswrapper[4677]: I1007 13:08:24.237911 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:24Z","lastTransitionTime":"2025-10-07T13:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:24 crc kubenswrapper[4677]: I1007 13:08:24.302640 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 13:08:24 crc kubenswrapper[4677]: E1007 13:08:24.302827 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 13:08:24 crc kubenswrapper[4677]: I1007 13:08:24.340923 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:24 crc kubenswrapper[4677]: I1007 13:08:24.341027 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:24 crc kubenswrapper[4677]: I1007 13:08:24.341085 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:24 crc kubenswrapper[4677]: I1007 13:08:24.341111 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:24 crc kubenswrapper[4677]: I1007 13:08:24.341162 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:24Z","lastTransitionTime":"2025-10-07T13:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:24 crc kubenswrapper[4677]: I1007 13:08:24.444243 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:24 crc kubenswrapper[4677]: I1007 13:08:24.444272 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:24 crc kubenswrapper[4677]: I1007 13:08:24.444279 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:24 crc kubenswrapper[4677]: I1007 13:08:24.444291 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:24 crc kubenswrapper[4677]: I1007 13:08:24.444300 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:24Z","lastTransitionTime":"2025-10-07T13:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:08:24 crc kubenswrapper[4677]: I1007 13:08:24.547354 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:24 crc kubenswrapper[4677]: I1007 13:08:24.547414 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:24 crc kubenswrapper[4677]: I1007 13:08:24.547497 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:24 crc kubenswrapper[4677]: I1007 13:08:24.547538 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:24 crc kubenswrapper[4677]: I1007 13:08:24.547559 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:24Z","lastTransitionTime":"2025-10-07T13:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:24 crc kubenswrapper[4677]: I1007 13:08:24.650297 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:24 crc kubenswrapper[4677]: I1007 13:08:24.650349 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:24 crc kubenswrapper[4677]: I1007 13:08:24.650362 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:24 crc kubenswrapper[4677]: I1007 13:08:24.650380 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:24 crc kubenswrapper[4677]: I1007 13:08:24.650396 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:24Z","lastTransitionTime":"2025-10-07T13:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:24 crc kubenswrapper[4677]: I1007 13:08:24.752893 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:24 crc kubenswrapper[4677]: I1007 13:08:24.752935 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:24 crc kubenswrapper[4677]: I1007 13:08:24.752947 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:24 crc kubenswrapper[4677]: I1007 13:08:24.752962 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:24 crc kubenswrapper[4677]: I1007 13:08:24.752974 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:24Z","lastTransitionTime":"2025-10-07T13:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:08:24 crc kubenswrapper[4677]: I1007 13:08:24.855737 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:24 crc kubenswrapper[4677]: I1007 13:08:24.855797 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:24 crc kubenswrapper[4677]: I1007 13:08:24.855816 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:24 crc kubenswrapper[4677]: I1007 13:08:24.855840 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:24 crc kubenswrapper[4677]: I1007 13:08:24.855857 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:24Z","lastTransitionTime":"2025-10-07T13:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:24 crc kubenswrapper[4677]: I1007 13:08:24.959825 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:24 crc kubenswrapper[4677]: I1007 13:08:24.959871 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:24 crc kubenswrapper[4677]: I1007 13:08:24.959890 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:24 crc kubenswrapper[4677]: I1007 13:08:24.959914 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:24 crc kubenswrapper[4677]: I1007 13:08:24.959934 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:24Z","lastTransitionTime":"2025-10-07T13:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:25 crc kubenswrapper[4677]: I1007 13:08:25.063106 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:25 crc kubenswrapper[4677]: I1007 13:08:25.063159 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:25 crc kubenswrapper[4677]: I1007 13:08:25.063180 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:25 crc kubenswrapper[4677]: I1007 13:08:25.063207 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:25 crc kubenswrapper[4677]: I1007 13:08:25.063228 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:25Z","lastTransitionTime":"2025-10-07T13:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:08:25 crc kubenswrapper[4677]: I1007 13:08:25.165988 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:25 crc kubenswrapper[4677]: I1007 13:08:25.166044 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:25 crc kubenswrapper[4677]: I1007 13:08:25.166061 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:25 crc kubenswrapper[4677]: I1007 13:08:25.166083 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:25 crc kubenswrapper[4677]: I1007 13:08:25.166099 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:25Z","lastTransitionTime":"2025-10-07T13:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:25 crc kubenswrapper[4677]: I1007 13:08:25.268913 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:25 crc kubenswrapper[4677]: I1007 13:08:25.268964 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:25 crc kubenswrapper[4677]: I1007 13:08:25.268980 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:25 crc kubenswrapper[4677]: I1007 13:08:25.269004 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:25 crc kubenswrapper[4677]: I1007 13:08:25.269022 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:25Z","lastTransitionTime":"2025-10-07T13:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:25 crc kubenswrapper[4677]: I1007 13:08:25.302394 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8bljr" Oct 07 13:08:25 crc kubenswrapper[4677]: I1007 13:08:25.302486 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:08:25 crc kubenswrapper[4677]: I1007 13:08:25.302519 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 13:08:25 crc kubenswrapper[4677]: E1007 13:08:25.302610 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8bljr" podUID="f63a77a6-7e4a-4ed0-a996-b8f80233d10c" Oct 07 13:08:25 crc kubenswrapper[4677]: E1007 13:08:25.302775 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 13:08:25 crc kubenswrapper[4677]: E1007 13:08:25.302883 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 13:08:25 crc kubenswrapper[4677]: I1007 13:08:25.371391 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:25 crc kubenswrapper[4677]: I1007 13:08:25.371488 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:25 crc kubenswrapper[4677]: I1007 13:08:25.371512 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:25 crc kubenswrapper[4677]: I1007 13:08:25.371541 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:25 crc kubenswrapper[4677]: I1007 13:08:25.371563 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:25Z","lastTransitionTime":"2025-10-07T13:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:25 crc kubenswrapper[4677]: I1007 13:08:25.474849 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:25 crc kubenswrapper[4677]: I1007 13:08:25.474906 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:25 crc kubenswrapper[4677]: I1007 13:08:25.474929 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:25 crc kubenswrapper[4677]: I1007 13:08:25.474960 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:25 crc kubenswrapper[4677]: I1007 13:08:25.474982 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:25Z","lastTransitionTime":"2025-10-07T13:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:08:25 crc kubenswrapper[4677]: I1007 13:08:25.578583 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:25 crc kubenswrapper[4677]: I1007 13:08:25.578649 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:25 crc kubenswrapper[4677]: I1007 13:08:25.578673 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:25 crc kubenswrapper[4677]: I1007 13:08:25.578701 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:25 crc kubenswrapper[4677]: I1007 13:08:25.578720 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:25Z","lastTransitionTime":"2025-10-07T13:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:25 crc kubenswrapper[4677]: I1007 13:08:25.681508 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:25 crc kubenswrapper[4677]: I1007 13:08:25.681561 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:25 crc kubenswrapper[4677]: I1007 13:08:25.681578 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:25 crc kubenswrapper[4677]: I1007 13:08:25.681599 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:25 crc kubenswrapper[4677]: I1007 13:08:25.681615 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:25Z","lastTransitionTime":"2025-10-07T13:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:25 crc kubenswrapper[4677]: I1007 13:08:25.785041 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:25 crc kubenswrapper[4677]: I1007 13:08:25.785116 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:25 crc kubenswrapper[4677]: I1007 13:08:25.785140 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:25 crc kubenswrapper[4677]: I1007 13:08:25.785167 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:25 crc kubenswrapper[4677]: I1007 13:08:25.785187 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:25Z","lastTransitionTime":"2025-10-07T13:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:08:25 crc kubenswrapper[4677]: I1007 13:08:25.888268 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:25 crc kubenswrapper[4677]: I1007 13:08:25.888346 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:25 crc kubenswrapper[4677]: I1007 13:08:25.888369 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:25 crc kubenswrapper[4677]: I1007 13:08:25.888393 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:25 crc kubenswrapper[4677]: I1007 13:08:25.888409 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:25Z","lastTransitionTime":"2025-10-07T13:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:25 crc kubenswrapper[4677]: I1007 13:08:25.991348 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:25 crc kubenswrapper[4677]: I1007 13:08:25.991388 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:25 crc kubenswrapper[4677]: I1007 13:08:25.991401 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:25 crc kubenswrapper[4677]: I1007 13:08:25.991416 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:25 crc kubenswrapper[4677]: I1007 13:08:25.991427 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:25Z","lastTransitionTime":"2025-10-07T13:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:26 crc kubenswrapper[4677]: I1007 13:08:26.094207 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:26 crc kubenswrapper[4677]: I1007 13:08:26.094327 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:26 crc kubenswrapper[4677]: I1007 13:08:26.094345 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:26 crc kubenswrapper[4677]: I1007 13:08:26.094368 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:26 crc kubenswrapper[4677]: I1007 13:08:26.094385 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:26Z","lastTransitionTime":"2025-10-07T13:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:08:26 crc kubenswrapper[4677]: I1007 13:08:26.196942 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:26 crc kubenswrapper[4677]: I1007 13:08:26.197010 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:26 crc kubenswrapper[4677]: I1007 13:08:26.197032 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:26 crc kubenswrapper[4677]: I1007 13:08:26.197059 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:26 crc kubenswrapper[4677]: I1007 13:08:26.197080 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:26Z","lastTransitionTime":"2025-10-07T13:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:26 crc kubenswrapper[4677]: I1007 13:08:26.300076 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:26 crc kubenswrapper[4677]: I1007 13:08:26.300130 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:26 crc kubenswrapper[4677]: I1007 13:08:26.300148 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:26 crc kubenswrapper[4677]: I1007 13:08:26.300169 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:26 crc kubenswrapper[4677]: I1007 13:08:26.300182 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:26Z","lastTransitionTime":"2025-10-07T13:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:26 crc kubenswrapper[4677]: I1007 13:08:26.302669 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 13:08:26 crc kubenswrapper[4677]: E1007 13:08:26.302861 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 13:08:26 crc kubenswrapper[4677]: I1007 13:08:26.403676 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:26 crc kubenswrapper[4677]: I1007 13:08:26.403749 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:26 crc kubenswrapper[4677]: I1007 13:08:26.403775 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:26 crc kubenswrapper[4677]: I1007 13:08:26.403805 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:26 crc kubenswrapper[4677]: I1007 13:08:26.403828 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:26Z","lastTransitionTime":"2025-10-07T13:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:26 crc kubenswrapper[4677]: I1007 13:08:26.507165 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:26 crc kubenswrapper[4677]: I1007 13:08:26.507228 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:26 crc kubenswrapper[4677]: I1007 13:08:26.507245 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:26 crc kubenswrapper[4677]: I1007 13:08:26.507268 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:26 crc kubenswrapper[4677]: I1007 13:08:26.507287 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:26Z","lastTransitionTime":"2025-10-07T13:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:08:26 crc kubenswrapper[4677]: I1007 13:08:26.511167 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:26 crc kubenswrapper[4677]: I1007 13:08:26.511223 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:26 crc kubenswrapper[4677]: I1007 13:08:26.511244 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:26 crc kubenswrapper[4677]: I1007 13:08:26.511263 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:26 crc kubenswrapper[4677]: I1007 13:08:26.511278 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:26Z","lastTransitionTime":"2025-10-07T13:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:26 crc kubenswrapper[4677]: E1007 13:08:26.534328 4677 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:08:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:08:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:08:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:08:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:08:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:08:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:08:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:08:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2461c0fe-8a8b-483d-90f2-2a3d8d7aca47\\\",\\\"systemUUID\\\":\\\"68c6c527-b248-4c1e-9fd2-b44685e78bcf\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:26Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:26 crc kubenswrapper[4677]: I1007 13:08:26.539825 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:26 crc kubenswrapper[4677]: I1007 13:08:26.539875 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 07 13:08:26 crc kubenswrapper[4677]: I1007 13:08:26.539891 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:26 crc kubenswrapper[4677]: I1007 13:08:26.539916 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:26 crc kubenswrapper[4677]: I1007 13:08:26.539934 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:26Z","lastTransitionTime":"2025-10-07T13:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:26 crc kubenswrapper[4677]: E1007 13:08:26.562275 4677 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:08:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:08:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:08:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:08:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:08:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:08:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:08:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:08:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2461c0fe-8a8b-483d-90f2-2a3d8d7aca47\\\",\\\"systemUUID\\\":\\\"68c6c527-b248-4c1e-9fd2-b44685e78bcf\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:26Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:26 crc kubenswrapper[4677]: I1007 13:08:26.568043 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:26 crc kubenswrapper[4677]: I1007 13:08:26.568082 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 07 13:08:26 crc kubenswrapper[4677]: I1007 13:08:26.568093 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:26 crc kubenswrapper[4677]: I1007 13:08:26.568109 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:26 crc kubenswrapper[4677]: I1007 13:08:26.568123 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:26Z","lastTransitionTime":"2025-10-07T13:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:26 crc kubenswrapper[4677]: E1007 13:08:26.588016 4677 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:08:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:08:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:08:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:08:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:08:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:08:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:08:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:08:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2461c0fe-8a8b-483d-90f2-2a3d8d7aca47\\\",\\\"systemUUID\\\":\\\"68c6c527-b248-4c1e-9fd2-b44685e78bcf\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:26Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:26 crc kubenswrapper[4677]: I1007 13:08:26.593051 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:26 crc kubenswrapper[4677]: I1007 13:08:26.593119 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 07 13:08:26 crc kubenswrapper[4677]: I1007 13:08:26.593144 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:26 crc kubenswrapper[4677]: I1007 13:08:26.593170 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:26 crc kubenswrapper[4677]: I1007 13:08:26.593190 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:26Z","lastTransitionTime":"2025-10-07T13:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:26 crc kubenswrapper[4677]: E1007 13:08:26.614029 4677 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:08:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:08:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:08:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:08:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:08:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:08:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:08:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:08:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2461c0fe-8a8b-483d-90f2-2a3d8d7aca47\\\",\\\"systemUUID\\\":\\\"68c6c527-b248-4c1e-9fd2-b44685e78bcf\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:26Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:26 crc kubenswrapper[4677]: I1007 13:08:26.621213 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:26 crc kubenswrapper[4677]: I1007 13:08:26.621258 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 07 13:08:26 crc kubenswrapper[4677]: I1007 13:08:26.621270 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:26 crc kubenswrapper[4677]: I1007 13:08:26.621285 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:26 crc kubenswrapper[4677]: I1007 13:08:26.621296 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:26Z","lastTransitionTime":"2025-10-07T13:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:26 crc kubenswrapper[4677]: E1007 13:08:26.640224 4677 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:08:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:08:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:08:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:08:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:08:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:08:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:08:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:08:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2461c0fe-8a8b-483d-90f2-2a3d8d7aca47\\\",\\\"systemUUID\\\":\\\"68c6c527-b248-4c1e-9fd2-b44685e78bcf\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:26Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:26 crc kubenswrapper[4677]: E1007 13:08:26.640496 4677 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 07 13:08:26 crc kubenswrapper[4677]: I1007 13:08:26.642487 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Oct 07 13:08:26 crc kubenswrapper[4677]: I1007 13:08:26.642538 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:26 crc kubenswrapper[4677]: I1007 13:08:26.642558 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:26 crc kubenswrapper[4677]: I1007 13:08:26.642576 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:26 crc kubenswrapper[4677]: I1007 13:08:26.642591 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:26Z","lastTransitionTime":"2025-10-07T13:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:26 crc kubenswrapper[4677]: I1007 13:08:26.745573 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:26 crc kubenswrapper[4677]: I1007 13:08:26.745631 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:26 crc kubenswrapper[4677]: I1007 13:08:26.745648 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:26 crc kubenswrapper[4677]: I1007 13:08:26.745670 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:26 crc kubenswrapper[4677]: I1007 13:08:26.745687 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:26Z","lastTransitionTime":"2025-10-07T13:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:26 crc kubenswrapper[4677]: I1007 13:08:26.848780 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:26 crc kubenswrapper[4677]: I1007 13:08:26.848857 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:26 crc kubenswrapper[4677]: I1007 13:08:26.848881 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:26 crc kubenswrapper[4677]: I1007 13:08:26.848908 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:26 crc kubenswrapper[4677]: I1007 13:08:26.848930 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:26Z","lastTransitionTime":"2025-10-07T13:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:08:26 crc kubenswrapper[4677]: I1007 13:08:26.952307 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:26 crc kubenswrapper[4677]: I1007 13:08:26.952384 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:26 crc kubenswrapper[4677]: I1007 13:08:26.952406 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:26 crc kubenswrapper[4677]: I1007 13:08:26.952463 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:26 crc kubenswrapper[4677]: I1007 13:08:26.952484 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:26Z","lastTransitionTime":"2025-10-07T13:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:27 crc kubenswrapper[4677]: I1007 13:08:27.055206 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:27 crc kubenswrapper[4677]: I1007 13:08:27.055276 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:27 crc kubenswrapper[4677]: I1007 13:08:27.055296 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:27 crc kubenswrapper[4677]: I1007 13:08:27.055321 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:27 crc kubenswrapper[4677]: I1007 13:08:27.055340 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:27Z","lastTransitionTime":"2025-10-07T13:08:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:27 crc kubenswrapper[4677]: I1007 13:08:27.158376 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:27 crc kubenswrapper[4677]: I1007 13:08:27.158535 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:27 crc kubenswrapper[4677]: I1007 13:08:27.158556 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:27 crc kubenswrapper[4677]: I1007 13:08:27.158582 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:27 crc kubenswrapper[4677]: I1007 13:08:27.158602 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:27Z","lastTransitionTime":"2025-10-07T13:08:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:08:27 crc kubenswrapper[4677]: I1007 13:08:27.261860 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:27 crc kubenswrapper[4677]: I1007 13:08:27.261946 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:27 crc kubenswrapper[4677]: I1007 13:08:27.261974 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:27 crc kubenswrapper[4677]: I1007 13:08:27.262004 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:27 crc kubenswrapper[4677]: I1007 13:08:27.262026 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:27Z","lastTransitionTime":"2025-10-07T13:08:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:27 crc kubenswrapper[4677]: I1007 13:08:27.302868 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8bljr" Oct 07 13:08:27 crc kubenswrapper[4677]: I1007 13:08:27.302912 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:08:27 crc kubenswrapper[4677]: E1007 13:08:27.303072 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8bljr" podUID="f63a77a6-7e4a-4ed0-a996-b8f80233d10c" Oct 07 13:08:27 crc kubenswrapper[4677]: I1007 13:08:27.303169 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 13:08:27 crc kubenswrapper[4677]: E1007 13:08:27.303365 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 13:08:27 crc kubenswrapper[4677]: E1007 13:08:27.303586 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 13:08:27 crc kubenswrapper[4677]: I1007 13:08:27.364604 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:27 crc kubenswrapper[4677]: I1007 13:08:27.364691 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:27 crc kubenswrapper[4677]: I1007 13:08:27.364715 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:27 crc kubenswrapper[4677]: I1007 13:08:27.364800 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:27 crc kubenswrapper[4677]: I1007 13:08:27.364920 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:27Z","lastTransitionTime":"2025-10-07T13:08:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:27 crc kubenswrapper[4677]: I1007 13:08:27.468193 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:27 crc kubenswrapper[4677]: I1007 13:08:27.468290 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:27 crc kubenswrapper[4677]: I1007 13:08:27.468316 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:27 crc kubenswrapper[4677]: I1007 13:08:27.468344 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:27 crc kubenswrapper[4677]: I1007 13:08:27.468362 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:27Z","lastTransitionTime":"2025-10-07T13:08:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:08:27 crc kubenswrapper[4677]: I1007 13:08:27.571599 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:27 crc kubenswrapper[4677]: I1007 13:08:27.571643 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:27 crc kubenswrapper[4677]: I1007 13:08:27.571655 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:27 crc kubenswrapper[4677]: I1007 13:08:27.571704 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:27 crc kubenswrapper[4677]: I1007 13:08:27.571715 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:27Z","lastTransitionTime":"2025-10-07T13:08:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:27 crc kubenswrapper[4677]: I1007 13:08:27.675197 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:27 crc kubenswrapper[4677]: I1007 13:08:27.675244 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:27 crc kubenswrapper[4677]: I1007 13:08:27.675259 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:27 crc kubenswrapper[4677]: I1007 13:08:27.675278 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:27 crc kubenswrapper[4677]: I1007 13:08:27.675293 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:27Z","lastTransitionTime":"2025-10-07T13:08:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:27 crc kubenswrapper[4677]: I1007 13:08:27.778073 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:27 crc kubenswrapper[4677]: I1007 13:08:27.778183 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:27 crc kubenswrapper[4677]: I1007 13:08:27.778211 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:27 crc kubenswrapper[4677]: I1007 13:08:27.778242 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:27 crc kubenswrapper[4677]: I1007 13:08:27.778264 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:27Z","lastTransitionTime":"2025-10-07T13:08:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:08:27 crc kubenswrapper[4677]: I1007 13:08:27.881382 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:27 crc kubenswrapper[4677]: I1007 13:08:27.881469 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:27 crc kubenswrapper[4677]: I1007 13:08:27.881490 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:27 crc kubenswrapper[4677]: I1007 13:08:27.881513 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:27 crc kubenswrapper[4677]: I1007 13:08:27.881529 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:27Z","lastTransitionTime":"2025-10-07T13:08:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:27 crc kubenswrapper[4677]: I1007 13:08:27.984607 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:27 crc kubenswrapper[4677]: I1007 13:08:27.984685 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:27 crc kubenswrapper[4677]: I1007 13:08:27.984717 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:27 crc kubenswrapper[4677]: I1007 13:08:27.984745 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:27 crc kubenswrapper[4677]: I1007 13:08:27.984765 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:27Z","lastTransitionTime":"2025-10-07T13:08:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:28 crc kubenswrapper[4677]: I1007 13:08:28.087174 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:28 crc kubenswrapper[4677]: I1007 13:08:28.087224 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:28 crc kubenswrapper[4677]: I1007 13:08:28.087239 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:28 crc kubenswrapper[4677]: I1007 13:08:28.087257 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:28 crc kubenswrapper[4677]: I1007 13:08:28.087270 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:28Z","lastTransitionTime":"2025-10-07T13:08:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:08:28 crc kubenswrapper[4677]: I1007 13:08:28.190813 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:28 crc kubenswrapper[4677]: I1007 13:08:28.190876 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:28 crc kubenswrapper[4677]: I1007 13:08:28.190893 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:28 crc kubenswrapper[4677]: I1007 13:08:28.190914 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:28 crc kubenswrapper[4677]: I1007 13:08:28.190930 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:28Z","lastTransitionTime":"2025-10-07T13:08:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:28 crc kubenswrapper[4677]: I1007 13:08:28.294198 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:28 crc kubenswrapper[4677]: I1007 13:08:28.294271 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:28 crc kubenswrapper[4677]: I1007 13:08:28.294296 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:28 crc kubenswrapper[4677]: I1007 13:08:28.294329 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:28 crc kubenswrapper[4677]: I1007 13:08:28.294352 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:28Z","lastTransitionTime":"2025-10-07T13:08:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:28 crc kubenswrapper[4677]: I1007 13:08:28.302784 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 13:08:28 crc kubenswrapper[4677]: E1007 13:08:28.302950 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 13:08:28 crc kubenswrapper[4677]: I1007 13:08:28.397664 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:28 crc kubenswrapper[4677]: I1007 13:08:28.397725 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:28 crc kubenswrapper[4677]: I1007 13:08:28.397744 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:28 crc kubenswrapper[4677]: I1007 13:08:28.397766 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:28 crc kubenswrapper[4677]: I1007 13:08:28.397782 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:28Z","lastTransitionTime":"2025-10-07T13:08:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:28 crc kubenswrapper[4677]: I1007 13:08:28.500900 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:28 crc kubenswrapper[4677]: I1007 13:08:28.500982 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:28 crc kubenswrapper[4677]: I1007 13:08:28.501001 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:28 crc kubenswrapper[4677]: I1007 13:08:28.501031 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:28 crc kubenswrapper[4677]: I1007 13:08:28.501052 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:28Z","lastTransitionTime":"2025-10-07T13:08:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:08:28 crc kubenswrapper[4677]: I1007 13:08:28.604260 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:28 crc kubenswrapper[4677]: I1007 13:08:28.604334 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:28 crc kubenswrapper[4677]: I1007 13:08:28.604353 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:28 crc kubenswrapper[4677]: I1007 13:08:28.604378 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:28 crc kubenswrapper[4677]: I1007 13:08:28.604396 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:28Z","lastTransitionTime":"2025-10-07T13:08:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:28 crc kubenswrapper[4677]: I1007 13:08:28.706901 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:28 crc kubenswrapper[4677]: I1007 13:08:28.706956 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:28 crc kubenswrapper[4677]: I1007 13:08:28.706969 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:28 crc kubenswrapper[4677]: I1007 13:08:28.706987 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:28 crc kubenswrapper[4677]: I1007 13:08:28.706999 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:28Z","lastTransitionTime":"2025-10-07T13:08:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:28 crc kubenswrapper[4677]: I1007 13:08:28.810838 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:28 crc kubenswrapper[4677]: I1007 13:08:28.810902 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:28 crc kubenswrapper[4677]: I1007 13:08:28.810924 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:28 crc kubenswrapper[4677]: I1007 13:08:28.810949 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:28 crc kubenswrapper[4677]: I1007 13:08:28.810967 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:28Z","lastTransitionTime":"2025-10-07T13:08:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:08:28 crc kubenswrapper[4677]: I1007 13:08:28.914016 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:28 crc kubenswrapper[4677]: I1007 13:08:28.914066 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:28 crc kubenswrapper[4677]: I1007 13:08:28.914078 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:28 crc kubenswrapper[4677]: I1007 13:08:28.914096 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:28 crc kubenswrapper[4677]: I1007 13:08:28.914112 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:28Z","lastTransitionTime":"2025-10-07T13:08:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:29 crc kubenswrapper[4677]: I1007 13:08:29.016829 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:29 crc kubenswrapper[4677]: I1007 13:08:29.016884 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:29 crc kubenswrapper[4677]: I1007 13:08:29.016898 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:29 crc kubenswrapper[4677]: I1007 13:08:29.016918 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:29 crc kubenswrapper[4677]: I1007 13:08:29.016932 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:29Z","lastTransitionTime":"2025-10-07T13:08:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:29 crc kubenswrapper[4677]: I1007 13:08:29.119166 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:29 crc kubenswrapper[4677]: I1007 13:08:29.119207 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:29 crc kubenswrapper[4677]: I1007 13:08:29.119216 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:29 crc kubenswrapper[4677]: I1007 13:08:29.119421 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:29 crc kubenswrapper[4677]: I1007 13:08:29.119454 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:29Z","lastTransitionTime":"2025-10-07T13:08:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:08:29 crc kubenswrapper[4677]: I1007 13:08:29.222939 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:29 crc kubenswrapper[4677]: I1007 13:08:29.222989 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:29 crc kubenswrapper[4677]: I1007 13:08:29.223001 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:29 crc kubenswrapper[4677]: I1007 13:08:29.223020 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:29 crc kubenswrapper[4677]: I1007 13:08:29.223033 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:29Z","lastTransitionTime":"2025-10-07T13:08:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:29 crc kubenswrapper[4677]: I1007 13:08:29.310721 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8bljr" Oct 07 13:08:29 crc kubenswrapper[4677]: I1007 13:08:29.310798 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 13:08:29 crc kubenswrapper[4677]: E1007 13:08:29.310906 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8bljr" podUID="f63a77a6-7e4a-4ed0-a996-b8f80233d10c" Oct 07 13:08:29 crc kubenswrapper[4677]: I1007 13:08:29.310810 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:08:29 crc kubenswrapper[4677]: E1007 13:08:29.311036 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 13:08:29 crc kubenswrapper[4677]: E1007 13:08:29.311164 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 13:08:29 crc kubenswrapper[4677]: I1007 13:08:29.326084 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:29 crc kubenswrapper[4677]: I1007 13:08:29.326355 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:29 crc kubenswrapper[4677]: I1007 13:08:29.326375 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:29 crc kubenswrapper[4677]: I1007 13:08:29.326396 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:29 crc kubenswrapper[4677]: I1007 13:08:29.326414 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:29Z","lastTransitionTime":"2025-10-07T13:08:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:29 crc kubenswrapper[4677]: I1007 13:08:29.330247 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-r7cnz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7879fa59-a7cb-4d29-ba3a-c91f43bfcba6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5b2cfaaf4533573a7cdf927cb9a0b61690f4f04ca22f5da5013fd218ee2cba1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r59hd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d3e4ef8267212ad1faf24bfcb3b6f633a283684ba587e304e94d434bd9a2618\\\",
\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r59hd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-r7cnz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:29Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:29 crc kubenswrapper[4677]: I1007 13:08:29.362757 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29c8j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3458826a-000d-407d-92c8-236d1a05842e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cf7d8cdd34bc883eae38c5e4690efd4e1c29cc633b5bbadc5de2b5b844a9da3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b77b2aafb3baf1c5b72d62156bd1c1bec76385637d5795166fe3d4f22a169503\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99b5fbb5ad3249aa5264c37bd635ed5f6283ec72c7eb071002cd7bddc12052f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eee7c253a1a514447553be977a3e534608ef6a1178664bf139ee84ec41180db0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f333db7aeb7d3cd308131992b4cd1284c1c56e27bbfd731404febc0efc953925\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ddf4e352b778815786f6fb204486a53d958310e53569f89a2895fe388a727da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d41f7a2621bf9c7b0e852cc4fb11dc29a2856c5b
6519628df5c77b064c310b1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d41f7a2621bf9c7b0e852cc4fb11dc29a2856c5b6519628df5c77b064c310b1b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T13:08:06Z\\\",\\\"message\\\":\\\"release.openshift.io/self-managed-high-availability:true include.release.openshift.io/single-node-developer:true service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-secret-name:serving-cert service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [{config.openshift.io/v1 ClusterVersion version 9101b518-476b-4eea-8fa6-69b0534e5caa 0xc00767ca9f \\\\u003cnil\\\\u003e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:https,Protocol:TCP,Port:443,TargetPort:{0 8443 },NodePort:0,AppProtocol:nil,},},Selector:map[string]string{app: service-ca-operator,},ClusterIP:10.217.4.40,Type:ClusterIP,ExternalIPs:[],SessionAffinity:None,LoadBalancerIP:,LoadBalancerSourceRanges:[],ExternalName:,ExternalTrafficPolicy:,HealthCheckNodePort:0,PublishNotReadyAddresses:false,SessionAffinityConfig:nil,IPFamilyPolicy:*SingleStack,ClusterIPs:[10.217.4.40],IPFamilies:[IPv4],AllocateLoadBalancerNodePorts:nil,LoadBalancerClass:nil,InternalTrafficPolicy:*Cluster,TrafficDistribution:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ingress:[]LoadBalancerIngress{},},Conditions:[]Condition{},},}\\\\nF1007 13:08:06.342882 6400 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:08:05Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-29c8j_openshift-ovn-kubernetes(3458826a-000d-407d-92c8-236d1a05842e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1f410624ff7e026c196d43d5ef830ce7b34981b703d5399a135dab0122640ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa229d0805576aa04db8e58e52e8c8bdc07663414dde42a94cac4fe628cfac7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa229d0805576aa04db8e58e52e8c8bdc07663414dde42a94cac4fe628cfac7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-29c8j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:29Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:29 crc kubenswrapper[4677]: I1007 13:08:29.391045 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pjgpx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"73bebfb3-50b5-48b6-b348-1d1feb6202d2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fcea2caf828321399fab99a6225cb39dd0c4aba8481cc040a10d86e90b6e4029\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b0ac92c71edc3d5107aece2d0e005a546cf25d79d696f4e330b7c0c8babc546\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T13:08:16Z\\\",\\\"message\\\":\\\"2025-10-07T13:07:31+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_d2e9730f-3112-4cf9-bde0-fbf3e73808c6\\\\n2025-10-07T13:07:31+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_d2e9730f-3112-4cf9-bde0-fbf3e73808c6 to 
/host/opt/cni/bin/\\\\n2025-10-07T13:07:31Z [verbose] multus-daemon started\\\\n2025-10-07T13:07:31Z [verbose] Readiness Indicator file check\\\\n2025-10-07T13:08:16Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:08:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h59cg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pjgpx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:29Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:29 crc kubenswrapper[4677]: I1007 13:08:29.406682 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qf2v9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea5a5436-29f6-4edd-9d4d-22eb9dd828c3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1a8e8a31adbc84ed02ff984941bb00da95740b19e8717fc6d4fb39b62338973\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ftmxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://97188126bcea5ad3844f74c9402e831926e1142944778240b4d4b26da7ea40c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ftmxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qf2v9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:29Z is after 2025-08-24T17:21:41Z" Oct 07 
13:08:29 crc kubenswrapper[4677]: I1007 13:08:29.422471 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8bljr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f63a77a6-7e4a-4ed0-a996-b8f80233d10c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rc97j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rc97j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:43Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8bljr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:29Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:29 crc kubenswrapper[4677]: I1007 13:08:29.428480 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:29 crc kubenswrapper[4677]: I1007 13:08:29.428553 4677 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:29 crc kubenswrapper[4677]: I1007 13:08:29.428571 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:29 crc kubenswrapper[4677]: I1007 13:08:29.428593 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:29 crc kubenswrapper[4677]: I1007 13:08:29.428639 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:29Z","lastTransitionTime":"2025-10-07T13:08:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:29 crc kubenswrapper[4677]: I1007 13:08:29.439002 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c40c47d2-50a4-43f1-9b6e-08b60a3260c5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f36e52a7e88b59d8fd38c1fe659ce9b539e514c9d31e326a3ed647ebb8d19781\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://84b1c015461fecca9e5122abe950f33e24f4b7188568ea84cb059a08a4637963\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/ku
bernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ecf81a2a9f147c0d9643f8e6c45248164053203ca4e5bbdc57c38e5803a5386\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e8dc3f8bdc52104efdb49a017d6497e2aaa3ed2b593794413fcd1acf2e06d36\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:29Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:29 crc kubenswrapper[4677]: I1007 13:08:29.455292 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:29Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:29 crc kubenswrapper[4677]: I1007 13:08:29.468988 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c2h2k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6a7b491-6ed9-4906-8d2d-d8913a581b95\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://079f44a0676fd6e659268707658dfce76f5c80881ebd1b7f77b831a653002cbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gh4gv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.
11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c2h2k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:29Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:29 crc kubenswrapper[4677]: I1007 13:08:29.486760 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-czmsr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67f7f734-b59a-447c-b4a5-5aeb78d3a4dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://620c115cda692d86e1c655fe633ade8d56b4ad3faff70ec3383e0d6931e91acd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84e2aad54ae491e74845fbb681491bde96694b5f242d15a1b4c0f62b81048e7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84e2aad54ae491e74845fbb681491bde96694b5f242d15a1b4c0f62b81048e7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-co
py\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2dfd4b5d1804d9d8e9ccf4dea8fb23cf60cd039e32d4bfc7d0b0e189da178b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2dfd4b5d1804d9d8e9ccf4dea8fb23cf60cd039e32d4bfc7d0b0e189da178b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c337ed2525a5d0aca41ff193138829ecd98172765110f97081d5a9b57ea39519\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c337ed2525a5d0aca41ff193138829ecd98172765110f97081d5a9b57ea39519\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b53d862bdcbdf9ebb58ca6717073ecf18236be981bf633bf22e3a077fa6ed90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\
"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b53d862bdcbdf9ebb58ca6717073ecf18236be981bf633bf22e3a077fa6ed90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3de85e8780fadb509355343e297ea73c2bd1047219be9911d445e044d40d378b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3de85e8780fadb509355343e297ea73c2bd1047219be9911d445e044d40d378b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6309cf358d3d85018e9e89fdb974146ab00944de09d3b38cfbf357df59b442f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6309cf358d3d85018e9e89fdb974146ab00944de09d3b38cfbf357df59b442f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:29Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-czmsr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:29Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:29 crc kubenswrapper[4677]: I1007 13:08:29.498682 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c3400d7-6126-498b-ba93-b88903b8d698\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:08:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:08:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffff3685813b0115a56c61e90cb80318d0265429d9be16eeb9a4d0870ec2a442\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://10f7bfbc2ad9b8a554ee30118d74323aeddc334939ab97ab61cbd5eb24ae1db3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c50c0057805c0f200830303329e9c3c8c75c20246ace7131caff6afb6aca6f79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\
\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6cd7e95c3dad799fcc99041e53970f7f6be7e9f4280d724394e9a06051043706\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6cd7e95c3dad799fcc99041e53970f7f6be7e9f4280d724394e9a06051043706\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:10Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:09Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:29Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:29 crc kubenswrapper[4677]: I1007 13:08:29.511084 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf9347ca53bc58ad2e19bdbccd5eb40fde5ef36cdc0c2a2899e7e86977208446\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7932ee6d24ab75f34dabb17b5c2732dc1437e94b4fab6cace5c5bf4d8b4a8fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:29Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:29 crc kubenswrapper[4677]: I1007 13:08:29.521703 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfed5e4c-bba3-4aab-86f7-27b722b12d83\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://373e947054a32f5e1ecb5b66d2a5e668a14a1c76b2329cc4a60ddee65c80a3e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a70bca62773a15d295207b342b32a4263173ee7ebee7222bb16204210e168a52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a70bca62773a15d295207b342b32a4263173ee7ebee7222bb16204210e168a52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:09Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:29Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:29 crc kubenswrapper[4677]: I1007 13:08:29.530928 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:29 crc kubenswrapper[4677]: I1007 13:08:29.530957 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 07 13:08:29 crc kubenswrapper[4677]: I1007 13:08:29.530975 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:29 crc kubenswrapper[4677]: I1007 13:08:29.530991 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:29 crc kubenswrapper[4677]: I1007 13:08:29.531003 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:29Z","lastTransitionTime":"2025-10-07T13:08:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:29 crc kubenswrapper[4677]: I1007 13:08:29.534520 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ef29067dc23263a94c4f861ada9ebbe04aae442de3da9fa34db521177f60ce8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:29Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:29 crc kubenswrapper[4677]: I1007 13:08:29.548383 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:29Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:29 crc kubenswrapper[4677]: I1007 13:08:29.560007 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8xd94" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f78c6e9a-e5e3-4296-b2e6-3ba36d1808ff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2e7ebbc9f01ac7f853075c65c8cc57c691cf3f95e41036294486ad4a3bb807c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8xd94\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:29Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:29 crc kubenswrapper[4677]: I1007 13:08:29.583992 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9c35782-52f8-4fbc-9e52-07ee92002e3d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9182677b05c8d32f333b4e806b6dc29e0ce3f6171616ed303459ccb6a3754a4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://550a4491cebcbd8b3a62831cce07b13bb79051cd51505aab1f74bcfee692f7b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d90d7cc786a9f94c269a99be97c00685a2e10bde12e0afe4db2de40b95749a47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e99541a9f53339e760fb1074be18ebfcb8b225c
64c290478559d2e3722ba9296\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73cec6b690e4d01d9d206a812f278832d622a7bdfb74ddcfb5904e19f721fae6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f16639187300b01b7c9f30dda26da7c1de9de9c66baf7ad716875eba41a5d7d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f16639187300b01b7c9f30dda26da7c1de9de9c66baf7ad716875eba41a5d7d6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dab4e59081dcacddedf345841dce8c940fae52aeee55d6c956e1364dd70872b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dab4e59081dcacddedf345841dce8c940fae52aeee55d6c956e1364dd70872b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://abcfe5aa83c73ae24b7f0b414b47344969924ca50a5e1669bfb4704f1386cd17\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abcfe5aa83c73ae24b7f0b414b47344969924ca50a5e1669bfb4704f1386cd17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:29Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:29 crc kubenswrapper[4677]: I1007 13:08:29.598355 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ad65790-6a90-4c21-b5c5-ac1ddf2cbe52\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34b15b8f71a10920a74f784c3440031e14726f661e10f628b269da08e70a7cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a51c6d4e82136b444754dc679f86455
8f74624af4ff94f794e473c92c8f6c87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96d7ddc445a61d5fd4959d1b3b4e2c93503111a12f461d945dd298a3f8540f65\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13a4e9dcaf4f4585c45625444ba093f84acc83f03e96235efd054bed4a38fc21\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f3139524d94a487c756b4a7e3660c0a4380bcba8aa8b588368530f43aab0b1d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T13:07:25Z\\\",\\\"message\\\":\\\"le observer\\\\nW1007 13:07:25.065953 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1007 13:07:25.066115 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 13:07:25.067558 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2107846005/tls.crt::/tmp/serving-cert-2107846005/tls.key\\\\\\\"\\\\nI1007 13:07:25.378394 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1007 13:07:25.383525 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1007 13:07:25.383565 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1007 13:07:25.383605 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1007 13:07:25.383617 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1007 13:07:25.390977 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1007 13:07:25.391014 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1007 13:07:25.391020 1 
secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 13:07:25.391038 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 13:07:25.391053 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1007 13:07:25.391061 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1007 13:07:25.391071 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1007 13:07:25.391088 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1007 13:07:25.394664 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b92962cb41f37a615f473651c01e37f5d53e01f3fb4b7c0eb2092095bb55239\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d927fbc5242eb951e8c2ec6d2859315f34e4b190bb43e646044111d1f583bf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d927fbc5242eb951e8c2ec6d2859315f34e4b190bb43e646044111d1f583bf0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:29Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:29 crc kubenswrapper[4677]: I1007 13:08:29.611687 4677 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:29Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:29 crc kubenswrapper[4677]: I1007 13:08:29.622938 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e49f4b2f98de9e297e6a31a5583120192adf9a013700b49bb419e54d9e75fdbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:29Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:29 crc kubenswrapper[4677]: I1007 13:08:29.634182 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:29 crc kubenswrapper[4677]: I1007 13:08:29.634232 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:29 crc kubenswrapper[4677]: I1007 13:08:29.634244 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:29 crc kubenswrapper[4677]: I1007 13:08:29.634263 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:29 crc kubenswrapper[4677]: I1007 13:08:29.634276 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:29Z","lastTransitionTime":"2025-10-07T13:08:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:08:29 crc kubenswrapper[4677]: I1007 13:08:29.738092 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:29 crc kubenswrapper[4677]: I1007 13:08:29.738157 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:29 crc kubenswrapper[4677]: I1007 13:08:29.738173 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:29 crc kubenswrapper[4677]: I1007 13:08:29.738198 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:29 crc kubenswrapper[4677]: I1007 13:08:29.738212 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:29Z","lastTransitionTime":"2025-10-07T13:08:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:29 crc kubenswrapper[4677]: I1007 13:08:29.840841 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:29 crc kubenswrapper[4677]: I1007 13:08:29.840900 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:29 crc kubenswrapper[4677]: I1007 13:08:29.840915 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:29 crc kubenswrapper[4677]: I1007 13:08:29.840935 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:29 crc kubenswrapper[4677]: I1007 13:08:29.840947 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:29Z","lastTransitionTime":"2025-10-07T13:08:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:29 crc kubenswrapper[4677]: I1007 13:08:29.943339 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:29 crc kubenswrapper[4677]: I1007 13:08:29.943378 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:29 crc kubenswrapper[4677]: I1007 13:08:29.943388 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:29 crc kubenswrapper[4677]: I1007 13:08:29.943403 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:29 crc kubenswrapper[4677]: I1007 13:08:29.943411 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:29Z","lastTransitionTime":"2025-10-07T13:08:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:08:30 crc kubenswrapper[4677]: I1007 13:08:30.046073 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:30 crc kubenswrapper[4677]: I1007 13:08:30.046130 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:30 crc kubenswrapper[4677]: I1007 13:08:30.046141 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:30 crc kubenswrapper[4677]: I1007 13:08:30.046158 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:30 crc kubenswrapper[4677]: I1007 13:08:30.046169 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:30Z","lastTransitionTime":"2025-10-07T13:08:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:30 crc kubenswrapper[4677]: I1007 13:08:30.148844 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:30 crc kubenswrapper[4677]: I1007 13:08:30.148918 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:30 crc kubenswrapper[4677]: I1007 13:08:30.148935 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:30 crc kubenswrapper[4677]: I1007 13:08:30.148959 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:30 crc kubenswrapper[4677]: I1007 13:08:30.148977 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:30Z","lastTransitionTime":"2025-10-07T13:08:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:30 crc kubenswrapper[4677]: I1007 13:08:30.251696 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:30 crc kubenswrapper[4677]: I1007 13:08:30.251739 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:30 crc kubenswrapper[4677]: I1007 13:08:30.251751 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:30 crc kubenswrapper[4677]: I1007 13:08:30.251768 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:30 crc kubenswrapper[4677]: I1007 13:08:30.251778 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:30Z","lastTransitionTime":"2025-10-07T13:08:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:08:30 crc kubenswrapper[4677]: I1007 13:08:30.302198 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 13:08:30 crc kubenswrapper[4677]: E1007 13:08:30.302510 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 13:08:30 crc kubenswrapper[4677]: I1007 13:08:30.354767 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:30 crc kubenswrapper[4677]: I1007 13:08:30.354843 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:30 crc kubenswrapper[4677]: I1007 13:08:30.354867 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:30 crc kubenswrapper[4677]: I1007 13:08:30.354893 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:30 crc kubenswrapper[4677]: I1007 13:08:30.354911 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:30Z","lastTransitionTime":"2025-10-07T13:08:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:30 crc kubenswrapper[4677]: I1007 13:08:30.458056 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:30 crc kubenswrapper[4677]: I1007 13:08:30.458122 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:30 crc kubenswrapper[4677]: I1007 13:08:30.458145 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:30 crc kubenswrapper[4677]: I1007 13:08:30.458173 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:30 crc kubenswrapper[4677]: I1007 13:08:30.458193 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:30Z","lastTransitionTime":"2025-10-07T13:08:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:08:30 crc kubenswrapper[4677]: I1007 13:08:30.560700 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:30 crc kubenswrapper[4677]: I1007 13:08:30.560784 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:30 crc kubenswrapper[4677]: I1007 13:08:30.560811 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:30 crc kubenswrapper[4677]: I1007 13:08:30.560841 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:30 crc kubenswrapper[4677]: I1007 13:08:30.560863 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:30Z","lastTransitionTime":"2025-10-07T13:08:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:30 crc kubenswrapper[4677]: I1007 13:08:30.665527 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:30 crc kubenswrapper[4677]: I1007 13:08:30.665588 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:30 crc kubenswrapper[4677]: I1007 13:08:30.665605 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:30 crc kubenswrapper[4677]: I1007 13:08:30.665629 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:30 crc kubenswrapper[4677]: I1007 13:08:30.665647 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:30Z","lastTransitionTime":"2025-10-07T13:08:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:30 crc kubenswrapper[4677]: I1007 13:08:30.769554 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:30 crc kubenswrapper[4677]: I1007 13:08:30.769613 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:30 crc kubenswrapper[4677]: I1007 13:08:30.769632 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:30 crc kubenswrapper[4677]: I1007 13:08:30.769657 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:30 crc kubenswrapper[4677]: I1007 13:08:30.769674 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:30Z","lastTransitionTime":"2025-10-07T13:08:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:08:30 crc kubenswrapper[4677]: I1007 13:08:30.873093 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:30 crc kubenswrapper[4677]: I1007 13:08:30.873159 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:30 crc kubenswrapper[4677]: I1007 13:08:30.873181 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:30 crc kubenswrapper[4677]: I1007 13:08:30.873212 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:30 crc kubenswrapper[4677]: I1007 13:08:30.873233 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:30Z","lastTransitionTime":"2025-10-07T13:08:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:30 crc kubenswrapper[4677]: I1007 13:08:30.976320 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:30 crc kubenswrapper[4677]: I1007 13:08:30.976392 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:30 crc kubenswrapper[4677]: I1007 13:08:30.976410 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:30 crc kubenswrapper[4677]: I1007 13:08:30.976476 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:30 crc kubenswrapper[4677]: I1007 13:08:30.976502 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:30Z","lastTransitionTime":"2025-10-07T13:08:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:31 crc kubenswrapper[4677]: I1007 13:08:31.080207 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:31 crc kubenswrapper[4677]: I1007 13:08:31.080539 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:31 crc kubenswrapper[4677]: I1007 13:08:31.080567 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:31 crc kubenswrapper[4677]: I1007 13:08:31.080599 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:31 crc kubenswrapper[4677]: I1007 13:08:31.080617 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:31Z","lastTransitionTime":"2025-10-07T13:08:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:08:31 crc kubenswrapper[4677]: I1007 13:08:31.183960 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:31 crc kubenswrapper[4677]: I1007 13:08:31.184027 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:31 crc kubenswrapper[4677]: I1007 13:08:31.184044 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:31 crc kubenswrapper[4677]: I1007 13:08:31.184072 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:31 crc kubenswrapper[4677]: I1007 13:08:31.184090 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:31Z","lastTransitionTime":"2025-10-07T13:08:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:31 crc kubenswrapper[4677]: I1007 13:08:31.287404 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:31 crc kubenswrapper[4677]: I1007 13:08:31.287507 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:31 crc kubenswrapper[4677]: I1007 13:08:31.287530 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:31 crc kubenswrapper[4677]: I1007 13:08:31.287561 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:31 crc kubenswrapper[4677]: I1007 13:08:31.287588 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:31Z","lastTransitionTime":"2025-10-07T13:08:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:31 crc kubenswrapper[4677]: I1007 13:08:31.302136 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 13:08:31 crc kubenswrapper[4677]: I1007 13:08:31.302182 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:08:31 crc kubenswrapper[4677]: E1007 13:08:31.302359 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 13:08:31 crc kubenswrapper[4677]: I1007 13:08:31.302420 4677 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-8bljr" Oct 07 13:08:31 crc kubenswrapper[4677]: E1007 13:08:31.302643 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 13:08:31 crc kubenswrapper[4677]: E1007 13:08:31.302765 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8bljr" podUID="f63a77a6-7e4a-4ed0-a996-b8f80233d10c" Oct 07 13:08:31 crc kubenswrapper[4677]: I1007 13:08:31.390535 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:31 crc kubenswrapper[4677]: I1007 13:08:31.390596 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:31 crc kubenswrapper[4677]: I1007 13:08:31.390616 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:31 crc kubenswrapper[4677]: I1007 13:08:31.390639 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:31 crc kubenswrapper[4677]: I1007 13:08:31.390656 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:31Z","lastTransitionTime":"2025-10-07T13:08:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:31 crc kubenswrapper[4677]: I1007 13:08:31.494025 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:31 crc kubenswrapper[4677]: I1007 13:08:31.494081 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:31 crc kubenswrapper[4677]: I1007 13:08:31.494100 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:31 crc kubenswrapper[4677]: I1007 13:08:31.494124 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:31 crc kubenswrapper[4677]: I1007 13:08:31.494141 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:31Z","lastTransitionTime":"2025-10-07T13:08:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:08:31 crc kubenswrapper[4677]: I1007 13:08:31.597690 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:31 crc kubenswrapper[4677]: I1007 13:08:31.597791 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:31 crc kubenswrapper[4677]: I1007 13:08:31.597818 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:31 crc kubenswrapper[4677]: I1007 13:08:31.597846 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:31 crc kubenswrapper[4677]: I1007 13:08:31.597868 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:31Z","lastTransitionTime":"2025-10-07T13:08:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:31 crc kubenswrapper[4677]: I1007 13:08:31.700739 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:31 crc kubenswrapper[4677]: I1007 13:08:31.700813 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:31 crc kubenswrapper[4677]: I1007 13:08:31.700837 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:31 crc kubenswrapper[4677]: I1007 13:08:31.700873 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:31 crc kubenswrapper[4677]: I1007 13:08:31.700903 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:31Z","lastTransitionTime":"2025-10-07T13:08:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:31 crc kubenswrapper[4677]: I1007 13:08:31.804527 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:31 crc kubenswrapper[4677]: I1007 13:08:31.804596 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:31 crc kubenswrapper[4677]: I1007 13:08:31.804622 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:31 crc kubenswrapper[4677]: I1007 13:08:31.804647 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:31 crc kubenswrapper[4677]: I1007 13:08:31.804664 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:31Z","lastTransitionTime":"2025-10-07T13:08:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:08:31 crc kubenswrapper[4677]: I1007 13:08:31.908153 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:31 crc kubenswrapper[4677]: I1007 13:08:31.908211 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:31 crc kubenswrapper[4677]: I1007 13:08:31.908228 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:31 crc kubenswrapper[4677]: I1007 13:08:31.908252 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:31 crc kubenswrapper[4677]: I1007 13:08:31.908268 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:31Z","lastTransitionTime":"2025-10-07T13:08:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:32 crc kubenswrapper[4677]: I1007 13:08:32.011809 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:32 crc kubenswrapper[4677]: I1007 13:08:32.011877 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:32 crc kubenswrapper[4677]: I1007 13:08:32.011919 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:32 crc kubenswrapper[4677]: I1007 13:08:32.011945 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:32 crc kubenswrapper[4677]: I1007 13:08:32.011965 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:32Z","lastTransitionTime":"2025-10-07T13:08:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:32 crc kubenswrapper[4677]: I1007 13:08:32.115886 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:32 crc kubenswrapper[4677]: I1007 13:08:32.115980 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:32 crc kubenswrapper[4677]: I1007 13:08:32.116006 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:32 crc kubenswrapper[4677]: I1007 13:08:32.116039 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:32 crc kubenswrapper[4677]: I1007 13:08:32.116065 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:32Z","lastTransitionTime":"2025-10-07T13:08:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:08:32 crc kubenswrapper[4677]: I1007 13:08:32.219378 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:32 crc kubenswrapper[4677]: I1007 13:08:32.219465 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:32 crc kubenswrapper[4677]: I1007 13:08:32.219482 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:32 crc kubenswrapper[4677]: I1007 13:08:32.219508 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:32 crc kubenswrapper[4677]: I1007 13:08:32.219526 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:32Z","lastTransitionTime":"2025-10-07T13:08:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:32 crc kubenswrapper[4677]: I1007 13:08:32.302491 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 13:08:32 crc kubenswrapper[4677]: E1007 13:08:32.302667 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 13:08:32 crc kubenswrapper[4677]: I1007 13:08:32.322227 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:32 crc kubenswrapper[4677]: I1007 13:08:32.322287 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:32 crc kubenswrapper[4677]: I1007 13:08:32.322305 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:32 crc kubenswrapper[4677]: I1007 13:08:32.322332 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:32 crc kubenswrapper[4677]: I1007 13:08:32.322349 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:32Z","lastTransitionTime":"2025-10-07T13:08:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:08:32 crc kubenswrapper[4677]: I1007 13:08:32.425330 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:32 crc kubenswrapper[4677]: I1007 13:08:32.425395 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:32 crc kubenswrapper[4677]: I1007 13:08:32.425476 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:32 crc kubenswrapper[4677]: I1007 13:08:32.425519 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:32 crc kubenswrapper[4677]: I1007 13:08:32.425541 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:32Z","lastTransitionTime":"2025-10-07T13:08:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:32 crc kubenswrapper[4677]: I1007 13:08:32.527892 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:32 crc kubenswrapper[4677]: I1007 13:08:32.527978 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:32 crc kubenswrapper[4677]: I1007 13:08:32.527995 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:32 crc kubenswrapper[4677]: I1007 13:08:32.528018 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:32 crc kubenswrapper[4677]: I1007 13:08:32.528032 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:32Z","lastTransitionTime":"2025-10-07T13:08:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:32 crc kubenswrapper[4677]: I1007 13:08:32.630839 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:32 crc kubenswrapper[4677]: I1007 13:08:32.630872 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:32 crc kubenswrapper[4677]: I1007 13:08:32.630884 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:32 crc kubenswrapper[4677]: I1007 13:08:32.630899 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:32 crc kubenswrapper[4677]: I1007 13:08:32.630910 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:32Z","lastTransitionTime":"2025-10-07T13:08:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:08:32 crc kubenswrapper[4677]: I1007 13:08:32.733246 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:32 crc kubenswrapper[4677]: I1007 13:08:32.733303 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:32 crc kubenswrapper[4677]: I1007 13:08:32.733350 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:32 crc kubenswrapper[4677]: I1007 13:08:32.733380 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:32 crc kubenswrapper[4677]: I1007 13:08:32.733397 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:32Z","lastTransitionTime":"2025-10-07T13:08:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:32 crc kubenswrapper[4677]: I1007 13:08:32.836347 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:32 crc kubenswrapper[4677]: I1007 13:08:32.836789 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:32 crc kubenswrapper[4677]: I1007 13:08:32.836814 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:32 crc kubenswrapper[4677]: I1007 13:08:32.836841 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:32 crc kubenswrapper[4677]: I1007 13:08:32.836863 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:32Z","lastTransitionTime":"2025-10-07T13:08:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:32 crc kubenswrapper[4677]: I1007 13:08:32.939903 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:32 crc kubenswrapper[4677]: I1007 13:08:32.939961 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:32 crc kubenswrapper[4677]: I1007 13:08:32.939974 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:32 crc kubenswrapper[4677]: I1007 13:08:32.939991 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:32 crc kubenswrapper[4677]: I1007 13:08:32.940001 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:32Z","lastTransitionTime":"2025-10-07T13:08:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:08:33 crc kubenswrapper[4677]: I1007 13:08:33.042295 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:33 crc kubenswrapper[4677]: I1007 13:08:33.042328 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:33 crc kubenswrapper[4677]: I1007 13:08:33.042337 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:33 crc kubenswrapper[4677]: I1007 13:08:33.042349 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:33 crc kubenswrapper[4677]: I1007 13:08:33.042358 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:33Z","lastTransitionTime":"2025-10-07T13:08:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:33 crc kubenswrapper[4677]: I1007 13:08:33.110334 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 13:08:33 crc kubenswrapper[4677]: E1007 13:08:33.110583 4677 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 13:09:37.110549739 +0000 UTC m=+148.596258904 (durationBeforeRetry 1m4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:08:33 crc kubenswrapper[4677]: I1007 13:08:33.144948 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:33 crc kubenswrapper[4677]: I1007 13:08:33.145017 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:33 crc kubenswrapper[4677]: I1007 13:08:33.145040 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:33 crc kubenswrapper[4677]: I1007 13:08:33.145069 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:33 crc kubenswrapper[4677]: I1007 13:08:33.145090 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:33Z","lastTransitionTime":"2025-10-07T13:08:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:33 crc kubenswrapper[4677]: I1007 13:08:33.211486 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:08:33 crc kubenswrapper[4677]: I1007 13:08:33.211547 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:08:33 crc kubenswrapper[4677]: E1007 13:08:33.211656 4677 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 07 13:08:33 crc kubenswrapper[4677]: E1007 13:08:33.211700 4677 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-07 13:09:37.211687713 +0000 UTC m=+148.697396828 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 07 13:08:33 crc kubenswrapper[4677]: E1007 13:08:33.211700 4677 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 07 13:08:33 crc kubenswrapper[4677]: E1007 13:08:33.211808 4677 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-07 13:09:37.211780326 +0000 UTC m=+148.697489471 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 07 13:08:33 crc kubenswrapper[4677]: I1007 13:08:33.248002 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:33 crc kubenswrapper[4677]: I1007 13:08:33.248063 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:33 crc kubenswrapper[4677]: I1007 13:08:33.248079 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:33 crc kubenswrapper[4677]: I1007 13:08:33.248115 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:33 crc kubenswrapper[4677]: I1007 13:08:33.248131 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:33Z","lastTransitionTime":"2025-10-07T13:08:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:33 crc kubenswrapper[4677]: I1007 13:08:33.302960 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8bljr" Oct 07 13:08:33 crc kubenswrapper[4677]: E1007 13:08:33.303236 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8bljr" podUID="f63a77a6-7e4a-4ed0-a996-b8f80233d10c" Oct 07 13:08:33 crc kubenswrapper[4677]: I1007 13:08:33.303742 4677 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 13:08:33 crc kubenswrapper[4677]: E1007 13:08:33.303946 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 13:08:33 crc kubenswrapper[4677]: I1007 13:08:33.303744 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:08:33 crc kubenswrapper[4677]: E1007 13:08:33.304791 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 13:08:33 crc kubenswrapper[4677]: I1007 13:08:33.305257 4677 scope.go:117] "RemoveContainer" containerID="d41f7a2621bf9c7b0e852cc4fb11dc29a2856c5b6519628df5c77b064c310b1b" Oct 07 13:08:33 crc kubenswrapper[4677]: I1007 13:08:33.312978 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 13:08:33 crc kubenswrapper[4677]: I1007 13:08:33.313095 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 13:08:33 crc kubenswrapper[4677]: E1007 13:08:33.313185 4677 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 07 13:08:33 crc kubenswrapper[4677]: E1007 13:08:33.313222 4677 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 07 13:08:33 crc kubenswrapper[4677]: E1007 13:08:33.313240 4677 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 13:08:33 crc kubenswrapper[4677]: E1007 13:08:33.313293 4677 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 07 13:08:33 crc kubenswrapper[4677]: E1007 13:08:33.313321 4677 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-07 13:09:37.313297322 +0000 UTC m=+148.799006447 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 13:08:33 crc kubenswrapper[4677]: E1007 13:08:33.313327 4677 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 07 13:08:33 crc kubenswrapper[4677]: E1007 13:08:33.313358 4677 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 13:08:33 crc kubenswrapper[4677]: E1007 13:08:33.313469 4677 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-07 13:09:37.313407125 +0000 UTC m=+148.799116280 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 13:08:33 crc kubenswrapper[4677]: I1007 13:08:33.351266 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:33 crc kubenswrapper[4677]: I1007 13:08:33.351303 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:33 crc kubenswrapper[4677]: I1007 13:08:33.351314 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:33 crc kubenswrapper[4677]: I1007 13:08:33.351332 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:33 crc kubenswrapper[4677]: I1007 13:08:33.351344 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:33Z","lastTransitionTime":"2025-10-07T13:08:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:08:33 crc kubenswrapper[4677]: I1007 13:08:33.454469 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:33 crc kubenswrapper[4677]: I1007 13:08:33.454509 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:33 crc kubenswrapper[4677]: I1007 13:08:33.454520 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:33 crc kubenswrapper[4677]: I1007 13:08:33.454540 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:33 crc kubenswrapper[4677]: I1007 13:08:33.454553 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:33Z","lastTransitionTime":"2025-10-07T13:08:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:33 crc kubenswrapper[4677]: I1007 13:08:33.557506 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:33 crc kubenswrapper[4677]: I1007 13:08:33.557569 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:33 crc kubenswrapper[4677]: I1007 13:08:33.557585 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:33 crc kubenswrapper[4677]: I1007 13:08:33.557609 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:33 crc kubenswrapper[4677]: I1007 13:08:33.557624 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:33Z","lastTransitionTime":"2025-10-07T13:08:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:33 crc kubenswrapper[4677]: I1007 13:08:33.660100 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:33 crc kubenswrapper[4677]: I1007 13:08:33.660147 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:33 crc kubenswrapper[4677]: I1007 13:08:33.660163 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:33 crc kubenswrapper[4677]: I1007 13:08:33.660181 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:33 crc kubenswrapper[4677]: I1007 13:08:33.660198 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:33Z","lastTransitionTime":"2025-10-07T13:08:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:08:33 crc kubenswrapper[4677]: I1007 13:08:33.762698 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:33 crc kubenswrapper[4677]: I1007 13:08:33.762752 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:33 crc kubenswrapper[4677]: I1007 13:08:33.762768 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:33 crc kubenswrapper[4677]: I1007 13:08:33.762792 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:33 crc kubenswrapper[4677]: I1007 13:08:33.762809 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:33Z","lastTransitionTime":"2025-10-07T13:08:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:33 crc kubenswrapper[4677]: I1007 13:08:33.832913 4677 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-29c8j_3458826a-000d-407d-92c8-236d1a05842e/ovnkube-controller/2.log" Oct 07 13:08:33 crc kubenswrapper[4677]: I1007 13:08:33.835452 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29c8j" event={"ID":"3458826a-000d-407d-92c8-236d1a05842e","Type":"ContainerStarted","Data":"b0636ac68feadc31552df6dee8669b4b1d477332b3405bff9bd63eaaa3362a6f"} Oct 07 13:08:33 crc kubenswrapper[4677]: I1007 13:08:33.835816 4677 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-29c8j" Oct 07 13:08:33 crc kubenswrapper[4677]: I1007 13:08:33.852657 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:33Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:33 crc kubenswrapper[4677]: I1007 13:08:33.865123 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:33 crc kubenswrapper[4677]: I1007 13:08:33.865159 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:33 crc kubenswrapper[4677]: I1007 13:08:33.865167 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:33 crc kubenswrapper[4677]: I1007 13:08:33.865180 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:33 crc kubenswrapper[4677]: I1007 13:08:33.865190 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:33Z","lastTransitionTime":"2025-10-07T13:08:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:08:33 crc kubenswrapper[4677]: I1007 13:08:33.869313 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e49f4b2f98de9e297e6a31a5583120192adf9a013700b49bb419e54d9e75fdbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:33Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:33 crc kubenswrapper[4677]: I1007 13:08:33.881372 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8bljr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f63a77a6-7e4a-4ed0-a996-b8f80233d10c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rc97j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rc97j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:43Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8bljr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:33Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:33 crc kubenswrapper[4677]: I1007 13:08:33.894857 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c40c47d2-50a4-43f1-9b6e-08b60a3260c5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f36e52a7e88b59d8fd38c1fe659ce9b539e514c9d31e326a3ed647ebb8d19781\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://84b1c015461fecca9e5122abe950f33e24f4b7188568ea84cb059a08a4637963\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ecf81a2a9f147c0d9643f8e6c45248164053203ca4e5bbdc57c38e5803a5386\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e8dc3f8bdc52104efdb49a017d6497e2aaa3ed2b593794413fcd1acf2e06d36\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:33Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:33 crc kubenswrapper[4677]: I1007 13:08:33.909902 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:33Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:33 crc kubenswrapper[4677]: I1007 13:08:33.928058 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-r7cnz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7879fa59-a7cb-4d29-ba3a-c91f43bfcba6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5b2cfaaf4533573a7cdf927cb9a0b61690f4f04ca22f5da5013fd218ee2cba1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r59hd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d3e4ef8267212ad1faf24bfcb3b6f633a283684ba587e304e94d434bd9a2618\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r59hd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-r7cnz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:33Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:33 crc kubenswrapper[4677]: I1007 13:08:33.947322 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29c8j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3458826a-000d-407d-92c8-236d1a05842e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cf7d8cdd34bc883eae38c5e4690efd4e1c29cc633b5bbadc5de2b5b844a9da3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b77b2aafb3baf1c5b72d62156bd1c1bec76385637d5795166fe3d4f22a169503\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99b5fbb5ad3249aa5264c37bd635ed5f6283ec72c7eb071002cd7bddc12052f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eee7c253a1a514447553be977a3e534608ef6a1178664bf139ee84ec41180db0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f333db7aeb7d3cd308131992b4cd1284c1c56e27bbfd731404febc0efc953925\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ddf4e352b778815786f6fb204486a53d958310e53569f89a2895fe388a727da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0636ac68feadc31552df6dee8669b4b1d477332
b3405bff9bd63eaaa3362a6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d41f7a2621bf9c7b0e852cc4fb11dc29a2856c5b6519628df5c77b064c310b1b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T13:08:06Z\\\",\\\"message\\\":\\\"release.openshift.io/self-managed-high-availability:true include.release.openshift.io/single-node-developer:true service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-secret-name:serving-cert service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [{config.openshift.io/v1 ClusterVersion version 9101b518-476b-4eea-8fa6-69b0534e5caa 0xc00767ca9f \\\\u003cnil\\\\u003e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:https,Protocol:TCP,Port:443,TargetPort:{0 8443 },NodePort:0,AppProtocol:nil,},},Selector:map[string]string{app: service-ca-operator,},ClusterIP:10.217.4.40,Type:ClusterIP,ExternalIPs:[],SessionAffinity:None,LoadBalancerIP:,LoadBalancerSourceRanges:[],ExternalName:,ExternalTrafficPolicy:,HealthCheckNodePort:0,PublishNotReadyAddresses:false,SessionAffinityConfig:nil,IPFamilyPolicy:*SingleStack,ClusterIPs:[10.217.4.40],IPFamilies:[IPv4],AllocateLoadBalancerNodePorts:nil,LoadBalancerClass:nil,InternalTrafficPolicy:*Cluster,TrafficDistribution:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ingress:[]LoadBalancerIngress{},},Conditions:[]Condition{},},}\\\\nF1007 13:08:06.342882 6400 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:08:05Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1f410624ff7e026c196d43d5ef830ce7b34981b703d5399a135dab0122640ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\
"containerID\\\":\\\"cri-o://fa229d0805576aa04db8e58e52e8c8bdc07663414dde42a94cac4fe628cfac7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa229d0805576aa04db8e58e52e8c8bdc07663414dde42a94cac4fe628cfac7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-29c8j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:33Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:33 crc kubenswrapper[4677]: I1007 13:08:33.967144 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pjgpx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"73bebfb3-50b5-48b6-b348-1d1feb6202d2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fcea2caf828321399fab99a6225cb39dd0c4aba8481cc040a10d86e90b6e4029\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b0ac92c71edc3d5107aece2d0e005a546cf25d79d696f4e330b7c0c8babc546\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T13:08:16Z\\\",\\\"message\\\":\\\"2025-10-07T13:07:31+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to 
/host/opt/cni/bin/upgrade_d2e9730f-3112-4cf9-bde0-fbf3e73808c6\\\\n2025-10-07T13:07:31+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_d2e9730f-3112-4cf9-bde0-fbf3e73808c6 to /host/opt/cni/bin/\\\\n2025-10-07T13:07:31Z [verbose] multus-daemon started\\\\n2025-10-07T13:07:31Z [verbose] Readiness Indicator file check\\\\n2025-10-07T13:08:16Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:08:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h59cg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pjgpx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:33Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:33 crc kubenswrapper[4677]: I1007 13:08:33.967473 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:33 crc kubenswrapper[4677]: I1007 13:08:33.967503 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:33 crc kubenswrapper[4677]: I1007 13:08:33.967513 4677 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:33 crc kubenswrapper[4677]: I1007 13:08:33.967530 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:33 crc kubenswrapper[4677]: I1007 13:08:33.967542 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:33Z","lastTransitionTime":"2025-10-07T13:08:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:33 crc kubenswrapper[4677]: I1007 13:08:33.983850 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qf2v9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea5a5436-29f6-4edd-9d4d-22eb9dd828c3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1a8e8a31adbc84ed02ff984941bb00da95740b19e8717fc6d4fb39b62338973\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ftmxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://97188126bcea5ad3844f74c9402e831926e1142944778240b4d4b26da7ea40c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"n
ame\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ftmxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qf2v9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:33Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:33 crc kubenswrapper[4677]: I1007 13:08:33.994103 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c3400d7-6126-498b-ba93-b88903b8d698\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:08:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:08:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffff3685813b0115a56c61e90cb80318d0265429d9be16eeb9a4d0870ec2a442\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://10f7bfbc2ad9b8a554ee30118d74323aeddc334939ab97ab61cbd5eb24ae1db3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource
-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c50c0057805c0f200830303329e9c3c8c75c20246ace7131caff6afb6aca6f79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6cd7e95c3dad799fcc99041e53970f7f6be7e9f4280d724394e9a06051043706\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6cd7e95c3dad799fcc99041e53970f7f6be7e9f4280d724394e9a06051043706\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:10Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:09Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:33Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:34 crc kubenswrapper[4677]: I1007 13:08:34.004623 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf9347ca53bc58ad2e19bdbccd5eb40fde5ef36cdc0c2a2899e7e86977208446\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7932ee6d24ab75f34dabb17b5c2732dc1437e94b4fab6cace5c5bf4d8b4a8fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:34Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:34 crc kubenswrapper[4677]: I1007 13:08:34.015904 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c2h2k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6a7b491-6ed9-4906-8d2d-d8913a581b95\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://079f44a0676fd6e659268707658dfce76f5c80881ebd1b7f77b831a653002cbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gh4gv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c2h2k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:34Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:34 crc kubenswrapper[4677]: I1007 13:08:34.033038 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-czmsr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"67f7f734-b59a-447c-b4a5-5aeb78d3a4dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://620c115cda692d86e1c655fe633ade8d56b4ad3faff70ec3383e0d6931e91acd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84e2aad54ae491e74845fbb681491bde96694b5f242d15a1b4c0f62b81048e7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84e2aad54ae491e74845fbb681491bde96694b5f242d15a1b4c0f62b81048e7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2dfd4b5d1804d9d8e9ccf4dea8fb23cf60cd039e32d4bfc7d0b0e189da178b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2dfd4b5d1804d9d8e9ccf4dea8fb23cf60cd039e32d4bfc7d0b0e189da178b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c337ed2525a5d0aca41ff193138829ecd98172765110f97081d5a9b57ea39519\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c337ed2525a5d0aca41ff193138829ecd98172765110f97081d5a9b57ea39519\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b53d862bdcbdf9ebb58ca6717073ecf18236be981bf633bf22e3a077fa6ed90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b53d862bdcbdf9ebb58ca6717073ecf18236be981bf633bf22e3a077fa6ed90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3de85e8780fadb509355343e297ea73c2bd1047219be9911d445e044d40d378b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3de85e8780fadb509355343e297ea73c2bd1047219be9911d445e044d40d378b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6309cf358d3d85018e9e89fdb974146ab00944de09d3b38cfbf357df59b442f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6309cf358d3d85018e9e89fdb974146ab00944de09d3b38cfbf357df59b442f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-czmsr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:34Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:34 crc kubenswrapper[4677]: I1007 13:08:34.058191 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9c35782-52f8-4fbc-9e52-07ee92002e3d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9182677b05c8d32f333b4e806b6dc29e0ce3f6171616ed303459ccb6a3754a4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://550a4491cebcbd8b3a62831cce07b13bb79051cd51505aab1f74bcfee692f7b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d90d7cc786a9f94c269a99be97c00685a2e10bde12e0afe4db2de40b95749a47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e99541a9f53339e760fb1074be18ebfcb8b225c
64c290478559d2e3722ba9296\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73cec6b690e4d01d9d206a812f278832d622a7bdfb74ddcfb5904e19f721fae6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f16639187300b01b7c9f30dda26da7c1de9de9c66baf7ad716875eba41a5d7d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f16639187300b01b7c9f30dda26da7c1de9de9c66baf7ad716875eba41a5d7d6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dab4e59081dcacddedf345841dce8c940fae52aeee55d6c956e1364dd70872b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dab4e59081dcacddedf345841dce8c940fae52aeee55d6c956e1364dd70872b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://abcfe5aa83c73ae24b7f0b414b47344969924ca50a5e1669bfb4704f1386cd17\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abcfe5aa83c73ae24b7f0b414b47344969924ca50a5e1669bfb4704f1386cd17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:34Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:34 crc kubenswrapper[4677]: I1007 13:08:34.070533 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:34 crc kubenswrapper[4677]: I1007 13:08:34.070572 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:34 crc kubenswrapper[4677]: I1007 13:08:34.070585 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:34 crc kubenswrapper[4677]: I1007 13:08:34.070603 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:34 crc kubenswrapper[4677]: I1007 13:08:34.070615 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:34Z","lastTransitionTime":"2025-10-07T13:08:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:08:34 crc kubenswrapper[4677]: I1007 13:08:34.072009 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ad65790-6a90-4c21-b5c5-ac1ddf2cbe52\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34b15b8f71a10920a74f784c3440031e14726f661e10f628b269da08e70a7cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a51c6d4e82136b444754dc679f864558f74624af4ff94f794e473c92c8f6c87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96d7ddc445a61d5fd4959d1b3b4e2c93503111a12f461d945dd298a3f8540f65\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/ku
bernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13a4e9dcaf4f4585c45625444ba093f84acc83f03e96235efd054bed4a38fc21\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f3139524d94a487c756b4a7e3660c0a4380bcba8aa8b588368530f43aab0b1d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T13:07:25Z\\\",\\\"message\\\":\\\"le observer\\\\nW1007 13:07:25.065953 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1007 13:07:25.066115 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 13:07:25.067558 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2107846005/tls.crt::/tmp/serving-cert-2107846005/tls.key\\\\\\\"\\\\nI1007 13:07:25.378394 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1007 13:07:25.383525 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1007 13:07:25.383565 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1007 13:07:25.383605 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1007 13:07:25.383617 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1007 13:07:25.390977 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1007 13:07:25.391014 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1007 13:07:25.391020 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 13:07:25.391038 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 13:07:25.391053 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1007 13:07:25.391061 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1007 13:07:25.391071 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1007 13:07:25.391088 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1007 13:07:25.394664 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b92962cb41f37a615f473651c01e37f5d53e01f3fb4b7c0eb2092095bb55239\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d927fbc5242eb951e8c2ec6d2859315f34e4b190bb43e646044111d1f583bf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d927fbc5242eb951e8c2ec6d2859315f34e4b190bb43e646044111d1f583bf0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:34Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:34 crc kubenswrapper[4677]: I1007 13:08:34.081517 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfed5e4c-bba3-4aab-86f7-27b722b12d83\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://373e947054a32f5e1ecb5b66d2a5e668a14a1c76b2329cc4a60ddee65c80a3e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a70bca62773a15d295207b342b32a4263173ee7ebee7222bb16204210e168a52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a70bca62773a15d295207b342b32a4263173ee7ebee7222bb16204210e168a52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:09Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:34Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:34 crc kubenswrapper[4677]: I1007 13:08:34.093283 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ef29067dc23263a94c4f861ada9ebbe04aae442de3da9fa34db521177f60ce8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:34Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:34 crc kubenswrapper[4677]: I1007 13:08:34.103763 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:34Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:34 crc kubenswrapper[4677]: I1007 13:08:34.112314 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8xd94" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f78c6e9a-e5e3-4296-b2e6-3ba36d1808ff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2e7ebbc9f01ac7f853075c65c8cc57c691cf3f95e41036294486ad4a3bb807c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8xd94\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:34Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:34 crc kubenswrapper[4677]: I1007 13:08:34.173256 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:34 crc kubenswrapper[4677]: I1007 13:08:34.173306 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:34 crc kubenswrapper[4677]: I1007 13:08:34.173319 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:34 crc kubenswrapper[4677]: I1007 13:08:34.173336 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:34 crc kubenswrapper[4677]: I1007 13:08:34.173347 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:34Z","lastTransitionTime":"2025-10-07T13:08:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:34 crc kubenswrapper[4677]: I1007 13:08:34.276368 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:34 crc kubenswrapper[4677]: I1007 13:08:34.276771 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:34 crc kubenswrapper[4677]: I1007 13:08:34.276784 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:34 crc kubenswrapper[4677]: I1007 13:08:34.276801 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:34 crc kubenswrapper[4677]: I1007 13:08:34.276817 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:34Z","lastTransitionTime":"2025-10-07T13:08:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:34 crc kubenswrapper[4677]: I1007 13:08:34.302930 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 13:08:34 crc kubenswrapper[4677]: E1007 13:08:34.303098 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 13:08:34 crc kubenswrapper[4677]: I1007 13:08:34.380100 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:34 crc kubenswrapper[4677]: I1007 13:08:34.380236 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:34 crc kubenswrapper[4677]: I1007 13:08:34.380266 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:34 crc kubenswrapper[4677]: I1007 13:08:34.380296 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:34 crc kubenswrapper[4677]: I1007 13:08:34.380316 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:34Z","lastTransitionTime":"2025-10-07T13:08:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:34 crc kubenswrapper[4677]: I1007 13:08:34.483788 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:34 crc kubenswrapper[4677]: I1007 13:08:34.483842 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:34 crc kubenswrapper[4677]: I1007 13:08:34.483857 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:34 crc kubenswrapper[4677]: I1007 13:08:34.483877 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:34 crc kubenswrapper[4677]: I1007 13:08:34.483890 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:34Z","lastTransitionTime":"2025-10-07T13:08:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:08:34 crc kubenswrapper[4677]: I1007 13:08:34.587315 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:34 crc kubenswrapper[4677]: I1007 13:08:34.587377 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:34 crc kubenswrapper[4677]: I1007 13:08:34.587394 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:34 crc kubenswrapper[4677]: I1007 13:08:34.587419 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:34 crc kubenswrapper[4677]: I1007 13:08:34.587465 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:34Z","lastTransitionTime":"2025-10-07T13:08:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:34 crc kubenswrapper[4677]: I1007 13:08:34.691180 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:34 crc kubenswrapper[4677]: I1007 13:08:34.691244 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:34 crc kubenswrapper[4677]: I1007 13:08:34.691261 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:34 crc kubenswrapper[4677]: I1007 13:08:34.691284 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:34 crc kubenswrapper[4677]: I1007 13:08:34.691304 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:34Z","lastTransitionTime":"2025-10-07T13:08:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:34 crc kubenswrapper[4677]: I1007 13:08:34.793724 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:34 crc kubenswrapper[4677]: I1007 13:08:34.793776 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:34 crc kubenswrapper[4677]: I1007 13:08:34.793792 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:34 crc kubenswrapper[4677]: I1007 13:08:34.793815 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:34 crc kubenswrapper[4677]: I1007 13:08:34.793831 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:34Z","lastTransitionTime":"2025-10-07T13:08:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:08:34 crc kubenswrapper[4677]: I1007 13:08:34.843112 4677 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-29c8j_3458826a-000d-407d-92c8-236d1a05842e/ovnkube-controller/3.log" Oct 07 13:08:34 crc kubenswrapper[4677]: I1007 13:08:34.843954 4677 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-29c8j_3458826a-000d-407d-92c8-236d1a05842e/ovnkube-controller/2.log" Oct 07 13:08:34 crc kubenswrapper[4677]: I1007 13:08:34.846668 4677 generic.go:334] "Generic (PLEG): container finished" podID="3458826a-000d-407d-92c8-236d1a05842e" containerID="b0636ac68feadc31552df6dee8669b4b1d477332b3405bff9bd63eaaa3362a6f" exitCode=1 Oct 07 13:08:34 crc kubenswrapper[4677]: I1007 13:08:34.846707 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29c8j" event={"ID":"3458826a-000d-407d-92c8-236d1a05842e","Type":"ContainerDied","Data":"b0636ac68feadc31552df6dee8669b4b1d477332b3405bff9bd63eaaa3362a6f"} Oct 07 13:08:34 crc kubenswrapper[4677]: I1007 13:08:34.846743 4677 scope.go:117] "RemoveContainer" containerID="d41f7a2621bf9c7b0e852cc4fb11dc29a2856c5b6519628df5c77b064c310b1b" Oct 07 13:08:34 crc kubenswrapper[4677]: I1007 13:08:34.847851 4677 scope.go:117] "RemoveContainer" containerID="b0636ac68feadc31552df6dee8669b4b1d477332b3405bff9bd63eaaa3362a6f" Oct 07 13:08:34 crc kubenswrapper[4677]: E1007 13:08:34.848136 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-29c8j_openshift-ovn-kubernetes(3458826a-000d-407d-92c8-236d1a05842e)\"" pod="openshift-ovn-kubernetes/ovnkube-node-29c8j" podUID="3458826a-000d-407d-92c8-236d1a05842e" Oct 07 13:08:34 crc kubenswrapper[4677]: I1007 13:08:34.868297 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8bljr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f63a77a6-7e4a-4ed0-a996-b8f80233d10c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rc97j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rc97j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:43Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8bljr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:34Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:34 crc kubenswrapper[4677]: I1007 13:08:34.884822 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c40c47d2-50a4-43f1-9b6e-08b60a3260c5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f36e52a7e88b59d8fd38c1fe659ce9b539e514c9d31e326a3ed647ebb8d19781\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://84b1c015461fecca9e5122abe950f33e24f4b7188568ea84cb059a08a4637963\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ecf81a2a9f147c0d9643f8e6c45248164053203ca4e5bbdc57c38e5803a5386\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e8dc3f8bdc52104efdb49a017d6497e2aaa3ed2b593794413fcd1acf2e06d36\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:34Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:34 crc kubenswrapper[4677]: I1007 13:08:34.896562 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:34 crc kubenswrapper[4677]: I1007 13:08:34.896638 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:34 crc kubenswrapper[4677]: I1007 13:08:34.896661 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:34 crc kubenswrapper[4677]: I1007 13:08:34.896686 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:34 crc kubenswrapper[4677]: I1007 13:08:34.896705 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:34Z","lastTransitionTime":"2025-10-07T13:08:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:08:34 crc kubenswrapper[4677]: I1007 13:08:34.903069 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:34Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:34 crc kubenswrapper[4677]: I1007 13:08:34.920486 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-r7cnz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7879fa59-a7cb-4d29-ba3a-c91f43bfcba6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5b2cfaaf4533573a7cdf927cb9a0b61690f4f04ca22f5da5013fd218ee2cba1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r59hd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d3e4ef8267212ad1faf24bfcb3b6f633a283684ba587e304e94d434bd9a2618\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r59hd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-r7cnz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:34Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:34 crc kubenswrapper[4677]: I1007 13:08:34.940047 4677 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29c8j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3458826a-000d-407d-92c8-236d1a05842e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cf7d8cdd34bc883eae38c5e4690efd4e1c29cc633b5bbadc5de2b5b844a9da3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b77b2aafb3baf1c5b72d62156bd1c1bec76385637d5795166fe3d4f22a169503\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99b5fbb5ad3249aa5264c37bd635ed5f6283ec72c7eb071002cd7bddc12052f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eee7c253a1a514447553be977a3e534608ef6a1178664bf139ee84ec41180db0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f333db7aeb7d3cd308131992b4cd1284c1c56e27bbfd731404febc0efc953925\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ddf4e352b778815786f6fb204486a53d958310e53569f89a2895fe388a727da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\
\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0636ac68feadc31552df6dee8669b4b1d477332b3405bff9bd63eaaa3362a6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d41f7a2621bf9c7b0e852cc4fb11dc29a2856c5b6519628df5c77b064c310b1b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T13:08:06Z\\\",\\\"message\\\":\\\"release.openshift.io/self-managed-high-availability:true include.release.openshift.io/single-node-developer:true service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-secret-name:serving-cert service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [{config.openshift.io/v1 ClusterVersion version 9101b518-476b-4eea-8fa6-69b0534e5caa 0xc00767ca9f \\\\u003cnil\\\\u003e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:https,Protocol:TCP,Port:443,TargetPort:{0 8443 },NodePort:0,AppProtocol:nil,},},Selector:map[string]string{app: service-ca-operator,},ClusterIP:10.217.4.40,Type:ClusterIP,ExternalIPs:[],SessionAffinity:None,LoadBalancerIP:,LoadBalancerSourceRanges:[],ExternalName:,ExternalTrafficPolicy:,HealthCheckNodePort:0,PublishNotReadyAddresses:false,SessionAffinityConfig:nil,IPFamilyPolicy:*SingleStack,ClusterIPs:[10.217.4.40],IPFamilies:[IPv4],AllocateLoadBalancerNodePorts:nil,LoadBalancerClass:nil,InternalTrafficPolicy:*Cluster,TrafficDistribution:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ingress:[]LoadBalancerIngress{},},Conditions:[]Condition{},},}\\\\nF1007 13:08:06.342882 6400 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:08:05Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0636ac68feadc31552df6dee8669b4b1d477332b3405bff9bd63eaaa3362a6f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T13:08:34Z\\\",\\\"message\\\":\\\"_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} 
selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.244:9393:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d8772e82-b0a4-4596-87d3-3d517c13344b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1007 13:08:34.206383 6814 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-cluster-version/cluster-version-operator]} name:Service_openshift-cluster-version/cluster-version-operator_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.182:9099:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {61d39e4d-21a9-4387-9a2b-fa4ad14792e2}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1007 13:08:34.206522 6814 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to sh\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1f41
0624ff7e026c196d43d5ef830ce7b34981b703d5399a135dab0122640ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa229d0805576aa04db8e58e52e8c8bdc07663414dde42a94cac4fe628cfac7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa229d0805576aa04db8e58e52e8c8bdc07663414dde42a94cac4fe628cfac7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-29c8j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:34Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:34 crc kubenswrapper[4677]: I1007 13:08:34.956309 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pjgpx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"73bebfb3-50b5-48b6-b348-1d1feb6202d2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fcea2caf828321399fab99a6225cb39dd0c4aba8481cc040a10d86e90b6e4029\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b0ac92c71edc3d5107aece2d0e005a546cf25d79d696f4e330b7c0c8babc546\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T13:08:16Z\\\",\\\"message\\\":\\\"2025-10-07T13:07:31+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_d2e9730f-3112-4cf9-bde0-fbf3e73808c6\\\\n2025-10-07T13:07:31+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_d2e9730f-3112-4cf9-bde0-fbf3e73808c6 to /host/opt/cni/bin/\\\\n2025-10-07T13:07:31Z [verbose] multus-daemon started\\\\n2025-10-07T13:07:31Z [verbose] Readiness Indicator file check\\\\n2025-10-07T13:08:16Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:08:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h59cg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pjgpx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:34Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:34 crc kubenswrapper[4677]: I1007 13:08:34.973765 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qf2v9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea5a5436-29f6-4edd-9d4d-22eb9dd828c3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1a8e8a31adbc84ed02ff984941bb00da95740b19e8717fc6d4fb39b62338973\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ftmxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://97188126bcea5ad3844f74c9402e831926e1142944778240b4d4b26da7ea40c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ftmxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qf2v9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:34Z is after 2025-08-24T17:21:41Z" Oct 07 
13:08:34 crc kubenswrapper[4677]: I1007 13:08:34.990132 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c3400d7-6126-498b-ba93-b88903b8d698\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:08:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:08:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffff3685813b0115a56c61e90cb80318d0265429d9be16eeb9a4d0870ec2a442\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://10f7bfbc2ad9b8a554ee30118d74323aeddc334939ab97ab61cbd5eb24ae1db3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c50c0057805c0f200830303329e9c3c8c75c20246ace7131caff6afb6aca6f79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.
126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6cd7e95c3dad799fcc99041e53970f7f6be7e9f4280d724394e9a06051043706\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6cd7e95c3dad799fcc99041e53970f7f6be7e9f4280d724394e9a06051043706\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:10Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:09Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:34Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:34 crc kubenswrapper[4677]: I1007 13:08:34.999316 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:34 crc kubenswrapper[4677]: I1007 13:08:34.999356 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:34 crc kubenswrapper[4677]: I1007 13:08:34.999370 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:34 crc kubenswrapper[4677]: I1007 13:08:34.999393 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:34 crc kubenswrapper[4677]: I1007 13:08:34.999408 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:34Z","lastTransitionTime":"2025-10-07T13:08:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:08:35 crc kubenswrapper[4677]: I1007 13:08:35.006463 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf9347ca53bc58ad2e19bdbccd5eb40fde5ef36cdc0c2a2899e7e86977208446\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7932ee6d24ab75f34dabb17b5c2732dc1437e94b4fab6cace5c5bf4d8b4a8fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:35Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:35 crc kubenswrapper[4677]: I1007 13:08:35.020653 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c2h2k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6a7b491-6ed9-4906-8d2d-d8913a581b95\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://079f44a0676fd6e659268707658dfce76f5c80881ebd1b7f77b831a653002cbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gh4gv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c2h2k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:35Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:35 crc kubenswrapper[4677]: I1007 13:08:35.044028 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-czmsr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"67f7f734-b59a-447c-b4a5-5aeb78d3a4dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://620c115cda692d86e1c655fe633ade8d56b4ad3faff70ec3383e0d6931e91acd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84e2aad54ae491e74845fbb681491bde96694b5f242d15a1b4c0f62b81048e7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84e2aad54ae491e74845fbb681491bde96694b5f242d15a1b4c0f62b81048e7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2dfd4b5d1804d9d8e9ccf4dea8fb23cf60cd039e32d4bfc7d0b0e189da178b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2dfd4b5d1804d9d8e9ccf4dea8fb23cf60cd039e32d4bfc7d0b0e189da178b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c337ed2525a5d0aca41ff193138829ecd98172765110f97081d5a9b57ea39519\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c337ed2525a5d0aca41ff193138829ecd98172765110f97081d5a9b57ea39519\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b53d862bdcbdf9ebb58ca6717073ecf18236be981bf633bf22e3a077fa6ed90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b53d862bdcbdf9ebb58ca6717073ecf18236be981bf633bf22e3a077fa6ed90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3de85e8780fadb509355343e297ea73c2bd1047219be9911d445e044d40d378b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3de85e8780fadb509355343e297ea73c2bd1047219be9911d445e044d40d378b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6309cf358d3d85018e9e89fdb974146ab00944de09d3b38cfbf357df59b442f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6309cf358d3d85018e9e89fdb974146ab00944de09d3b38cfbf357df59b442f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-czmsr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:35Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:35 crc kubenswrapper[4677]: I1007 13:08:35.074590 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9c35782-52f8-4fbc-9e52-07ee92002e3d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9182677b05c8d32f333b4e806b6dc29e0ce3f6171616ed303459ccb6a3754a4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://550a4491cebcbd8b3a62831cce07b13bb79051cd51505aab1f74bcfee692f7b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d90d7cc786a9f94c269a99be97c00685a2e10bde12e0afe4db2de40b95749a47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e99541a9f53339e760fb1074be18ebfcb8b225c
64c290478559d2e3722ba9296\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73cec6b690e4d01d9d206a812f278832d622a7bdfb74ddcfb5904e19f721fae6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f16639187300b01b7c9f30dda26da7c1de9de9c66baf7ad716875eba41a5d7d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f16639187300b01b7c9f30dda26da7c1de9de9c66baf7ad716875eba41a5d7d6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dab4e59081dcacddedf345841dce8c940fae52aeee55d6c956e1364dd70872b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dab4e59081dcacddedf345841dce8c940fae52aeee55d6c956e1364dd70872b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://abcfe5aa83c73ae24b7f0b414b47344969924ca50a5e1669bfb4704f1386cd17\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abcfe5aa83c73ae24b7f0b414b47344969924ca50a5e1669bfb4704f1386cd17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:35Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:35 crc kubenswrapper[4677]: I1007 13:08:35.094309 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ad65790-6a90-4c21-b5c5-ac1ddf2cbe52\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34b15b8f71a10920a74f784c3440031e14726f661e10f628b269da08e70a7cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a51c6d4e82136b444754dc679f86455
8f74624af4ff94f794e473c92c8f6c87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96d7ddc445a61d5fd4959d1b3b4e2c93503111a12f461d945dd298a3f8540f65\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13a4e9dcaf4f4585c45625444ba093f84acc83f03e96235efd054bed4a38fc21\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f3139524d94a487c756b4a7e3660c0a4380bcba8aa8b588368530f43aab0b1d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T13:07:25Z\\\",\\\"message\\\":\\\"le observer\\\\nW1007 13:07:25.065953 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1007 13:07:25.066115 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 13:07:25.067558 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2107846005/tls.crt::/tmp/serving-cert-2107846005/tls.key\\\\\\\"\\\\nI1007 13:07:25.378394 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1007 13:07:25.383525 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1007 13:07:25.383565 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1007 13:07:25.383605 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1007 13:07:25.383617 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1007 13:07:25.390977 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1007 13:07:25.391014 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1007 13:07:25.391020 1 
secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 13:07:25.391038 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 13:07:25.391053 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1007 13:07:25.391061 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1007 13:07:25.391071 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1007 13:07:25.391088 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1007 13:07:25.394664 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b92962cb41f37a615f473651c01e37f5d53e01f3fb4b7c0eb2092095bb55239\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d927fbc5242eb951e8c2ec6d2859315f34e4b190bb43e646044111d1f583bf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d927fbc5242eb951e8c2ec6d2859315f34e4b190bb43e646044111d1f583bf0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:35Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:35 crc kubenswrapper[4677]: I1007 13:08:35.103004 4677 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:35 crc kubenswrapper[4677]: I1007 13:08:35.103056 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:35 crc kubenswrapper[4677]: I1007 13:08:35.103074 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:35 crc kubenswrapper[4677]: I1007 13:08:35.103097 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:35 crc kubenswrapper[4677]: I1007 13:08:35.103113 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:35Z","lastTransitionTime":"2025-10-07T13:08:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:35 crc kubenswrapper[4677]: I1007 13:08:35.110999 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfed5e4c-bba3-4aab-86f7-27b722b12d83\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://373e947054a32f5e1ecb5b66d2a5e668a14a1c76b2329cc4a60ddee65c80a3e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a70bca62773a15d295207b342b32a4263173ee7ebee7222bb16204210e168a52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a70bca62773a15d295207b342b32a4263173ee7ebee7222bb16204210e168a52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:09Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:35Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:35 crc kubenswrapper[4677]: I1007 13:08:35.130130 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ef29067dc23263a94c4f861ada9ebbe04aae442de3da9fa34db521177f60ce8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:35Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:35 crc kubenswrapper[4677]: I1007 13:08:35.148581 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:35Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:35 crc kubenswrapper[4677]: I1007 13:08:35.166736 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8xd94" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f78c6e9a-e5e3-4296-b2e6-3ba36d1808ff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2e7ebbc9f01ac7f853075c65c8cc57c691cf3f95e41036294486ad4a3bb807c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8xd94\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:35Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:35 crc kubenswrapper[4677]: I1007 13:08:35.187817 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:35Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:35 crc kubenswrapper[4677]: I1007 13:08:35.205870 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e49f4b2f98de9e297e6a31a5583120192adf9a013700b49bb419e54d9e75fdbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:35Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:35 crc kubenswrapper[4677]: I1007 13:08:35.206300 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:35 crc kubenswrapper[4677]: I1007 13:08:35.206343 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:35 crc kubenswrapper[4677]: I1007 13:08:35.206360 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:35 crc kubenswrapper[4677]: I1007 13:08:35.206381 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:35 crc kubenswrapper[4677]: I1007 13:08:35.206395 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:35Z","lastTransitionTime":"2025-10-07T13:08:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:35 crc kubenswrapper[4677]: I1007 13:08:35.302166 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 13:08:35 crc kubenswrapper[4677]: I1007 13:08:35.302239 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8bljr" Oct 07 13:08:35 crc kubenswrapper[4677]: I1007 13:08:35.302233 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:08:35 crc kubenswrapper[4677]: E1007 13:08:35.302358 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 13:08:35 crc kubenswrapper[4677]: E1007 13:08:35.302677 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8bljr" podUID="f63a77a6-7e4a-4ed0-a996-b8f80233d10c" Oct 07 13:08:35 crc kubenswrapper[4677]: E1007 13:08:35.302792 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 13:08:35 crc kubenswrapper[4677]: I1007 13:08:35.308934 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:35 crc kubenswrapper[4677]: I1007 13:08:35.308997 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:35 crc kubenswrapper[4677]: I1007 13:08:35.309017 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:35 crc kubenswrapper[4677]: I1007 13:08:35.309046 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:35 crc kubenswrapper[4677]: I1007 13:08:35.309066 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:35Z","lastTransitionTime":"2025-10-07T13:08:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:35 crc kubenswrapper[4677]: I1007 13:08:35.412684 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:35 crc kubenswrapper[4677]: I1007 13:08:35.412743 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:35 crc kubenswrapper[4677]: I1007 13:08:35.412757 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:35 crc kubenswrapper[4677]: I1007 13:08:35.412776 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:35 crc kubenswrapper[4677]: I1007 13:08:35.412788 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:35Z","lastTransitionTime":"2025-10-07T13:08:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:08:35 crc kubenswrapper[4677]: I1007 13:08:35.515471 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:35 crc kubenswrapper[4677]: I1007 13:08:35.515538 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:35 crc kubenswrapper[4677]: I1007 13:08:35.515554 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:35 crc kubenswrapper[4677]: I1007 13:08:35.515579 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:35 crc kubenswrapper[4677]: I1007 13:08:35.515596 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:35Z","lastTransitionTime":"2025-10-07T13:08:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:35 crc kubenswrapper[4677]: I1007 13:08:35.618501 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:35 crc kubenswrapper[4677]: I1007 13:08:35.618557 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:35 crc kubenswrapper[4677]: I1007 13:08:35.618574 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:35 crc kubenswrapper[4677]: I1007 13:08:35.618599 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:35 crc kubenswrapper[4677]: I1007 13:08:35.618615 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:35Z","lastTransitionTime":"2025-10-07T13:08:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:35 crc kubenswrapper[4677]: I1007 13:08:35.721469 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:35 crc kubenswrapper[4677]: I1007 13:08:35.721543 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:35 crc kubenswrapper[4677]: I1007 13:08:35.721567 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:35 crc kubenswrapper[4677]: I1007 13:08:35.721598 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:35 crc kubenswrapper[4677]: I1007 13:08:35.721620 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:35Z","lastTransitionTime":"2025-10-07T13:08:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:08:35 crc kubenswrapper[4677]: I1007 13:08:35.824914 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:35 crc kubenswrapper[4677]: I1007 13:08:35.825017 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:35 crc kubenswrapper[4677]: I1007 13:08:35.825059 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:35 crc kubenswrapper[4677]: I1007 13:08:35.825095 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:35 crc kubenswrapper[4677]: I1007 13:08:35.825117 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:35Z","lastTransitionTime":"2025-10-07T13:08:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:35 crc kubenswrapper[4677]: I1007 13:08:35.853588 4677 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-29c8j_3458826a-000d-407d-92c8-236d1a05842e/ovnkube-controller/3.log" Oct 07 13:08:35 crc kubenswrapper[4677]: I1007 13:08:35.859412 4677 scope.go:117] "RemoveContainer" containerID="b0636ac68feadc31552df6dee8669b4b1d477332b3405bff9bd63eaaa3362a6f" Oct 07 13:08:35 crc kubenswrapper[4677]: E1007 13:08:35.859707 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-29c8j_openshift-ovn-kubernetes(3458826a-000d-407d-92c8-236d1a05842e)\"" pod="openshift-ovn-kubernetes/ovnkube-node-29c8j" podUID="3458826a-000d-407d-92c8-236d1a05842e" Oct 07 13:08:35 crc kubenswrapper[4677]: I1007 13:08:35.880661 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:35Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:35 crc kubenswrapper[4677]: I1007 13:08:35.898524 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e49f4b2f98de9e297e6a31a5583120192adf9a013700b49bb419e54d9e75fdbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:35Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:35 crc kubenswrapper[4677]: I1007 13:08:35.918887 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pjgpx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"73bebfb3-50b5-48b6-b348-1d1feb6202d2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fcea2caf828321399fab99a6225cb39dd0c4aba8481cc040a10d86e90b6e4029\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b0ac92c71edc3d5107aece2d0e005a546cf25d79d696f4e330b7c0c8babc546\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T13:08:16Z\\\",\\\"message\\\":\\\"2025-10-07T13:07:31+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_d2e9730f-3112-4cf9-bde0-fbf3e73808c6\\\\n2025-10-07T13:07:31+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_d2e9730f-3112-4cf9-bde0-fbf3e73808c6 to /host/opt/cni/bin/\\\\n2025-10-07T13:07:31Z [verbose] multus-daemon started\\\\n2025-10-07T13:07:31Z [verbose] Readiness Indicator file check\\\\n2025-10-07T13:08:16Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:08:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h59cg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pjgpx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:35Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:35 crc kubenswrapper[4677]: I1007 13:08:35.929017 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:35 crc kubenswrapper[4677]: I1007 13:08:35.929073 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:35 crc kubenswrapper[4677]: I1007 13:08:35.929088 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:35 crc kubenswrapper[4677]: I1007 13:08:35.929108 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:35 crc kubenswrapper[4677]: I1007 13:08:35.929121 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:35Z","lastTransitionTime":"2025-10-07T13:08:35Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:35 crc kubenswrapper[4677]: I1007 13:08:35.935132 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qf2v9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea5a5436-29f6-4edd-9d4d-22eb9dd828c3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1a8e8a31adbc84ed02ff984941bb00da95740b19e8717fc6d4fb39b62338973\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ftmxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://97188126bcea5ad3844f74c9402e831926e1142944778240b4d4b26da7ea40c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ftmxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-
10-07T13:07:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qf2v9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:35Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:35 crc kubenswrapper[4677]: I1007 13:08:35.951140 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8bljr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f63a77a6-7e4a-4ed0-a996-b8f80233d10c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rc97j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rc97j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:43Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8bljr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:35Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:35 crc kubenswrapper[4677]: I1007 13:08:35.967783 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c40c47d2-50a4-43f1-9b6e-08b60a3260c5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f36e52a7e88b59d8fd38c1fe659ce9b539e514c9d31e326a3ed647ebb8d19781\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://84b1c015461fecca9e5122abe950f33e24f4b7188568ea84cb059a08a4637963\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ecf81a2a9f147c0d9643f8e6c45248164053203ca4e5bbdc57c38e5803a5386\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e8dc3f8bdc52104efdb49a017d6497e2aaa3ed2b593794413fcd1acf2e06d36\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:35Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:35 crc kubenswrapper[4677]: I1007 13:08:35.985302 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:35Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:36 crc kubenswrapper[4677]: I1007 13:08:36.001810 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-r7cnz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7879fa59-a7cb-4d29-ba3a-c91f43bfcba6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5b2cfaaf4533573a7cdf927cb9a0b61690f4f04ca22f5da5013fd218ee2cba1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r59hd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d3e4ef8267212ad1faf24bfcb3b6f633a283684ba587e304e94d434bd9a2618\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r59hd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-r7cnz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:35Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:36 crc kubenswrapper[4677]: I1007 13:08:36.024136 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29c8j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3458826a-000d-407d-92c8-236d1a05842e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cf7d8cdd34bc883eae38c5e4690efd4e1c29cc633b5bbadc5de2b5b844a9da3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b77b2aafb3baf1c5b72d62156bd1c1bec76385637d5795166fe3d4f22a169503\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99b5fbb5ad3249aa5264c37bd635ed5f6283ec72c7eb071002cd7bddc12052f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eee7c253a1a514447553be977a3e534608ef6a1178664bf139ee84ec41180db0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f333db7aeb7d3cd308131992b4cd1284c1c56e27bbfd731404febc0efc953925\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ddf4e352b778815786f6fb204486a53d958310e53569f89a2895fe388a727da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0636ac68feadc31552df6dee8669b4b1d477332
b3405bff9bd63eaaa3362a6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0636ac68feadc31552df6dee8669b4b1d477332b3405bff9bd63eaaa3362a6f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T13:08:34Z\\\",\\\"message\\\":\\\"_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.244:9393:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d8772e82-b0a4-4596-87d3-3d517c13344b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1007 13:08:34.206383 6814 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-cluster-version/cluster-version-operator]} name:Service_openshift-cluster-version/cluster-version-operator_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.182:9099:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {61d39e4d-21a9-4387-9a2b-fa4ad14792e2}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1007 13:08:34.206522 6814 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to sh\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:08:33Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-29c8j_openshift-ovn-kubernetes(3458826a-000d-407d-92c8-236d1a05842e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1f410624ff7e026c196d43d5ef830ce7b34981b703d5399a135dab0122640ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa229d0805576aa04db8e58e52e8c8bdc07663414dde42a94cac4fe628cfac7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa229d0805576aa04db8e58e52e8c8bdc07663414dde42a94cac4fe628cfac7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-29c8j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:36Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:36 crc kubenswrapper[4677]: I1007 13:08:36.031741 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:36 crc kubenswrapper[4677]: I1007 13:08:36.031778 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:36 crc kubenswrapper[4677]: I1007 13:08:36.031790 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:36 crc kubenswrapper[4677]: I1007 13:08:36.031805 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:36 crc kubenswrapper[4677]: I1007 13:08:36.031818 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:36Z","lastTransitionTime":"2025-10-07T13:08:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:08:36 crc kubenswrapper[4677]: I1007 13:08:36.039904 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c3400d7-6126-498b-ba93-b88903b8d698\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:08:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:08:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffff3685813b0115a56c61e90cb80318d0265429d9be16eeb9a4d0870ec2a442\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://10f7bfbc2ad9b8a554ee30118d74323aeddc334939ab97ab61cbd5eb24ae1db3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c50c0057805c0f200830303329e9c3c8c75c20246ace7131caff6afb6aca6f79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"
cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6cd7e95c3dad799fcc99041e53970f7f6be7e9f4280d724394e9a06051043706\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6cd7e95c3dad799fcc99041e53970f7f6be7e9f4280d724394e9a06051043706\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:10Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:09Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:36Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:36 crc kubenswrapper[4677]: I1007 13:08:36.059491 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf9347ca53bc58ad2e19bdbccd5eb40fde5ef36cdc0c2a2899e7e86977208446\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7932ee6d24ab75f34dabb17b5c2732dc1437e94b4fab6cace5c5bf4d8b4a8fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919
d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:36Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:36 crc kubenswrapper[4677]: I1007 13:08:36.073973 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c2h2k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6a7b491-6ed9-4906-8d2d-d8913a581b95\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://079f44a0676fd6e659268707658dfce76f5c80881ebd1b7f77b831a653002cbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gh4gv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:29Z\\\"}}\" for pod 
\"openshift-dns\"/\"node-resolver-c2h2k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:36Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:36 crc kubenswrapper[4677]: I1007 13:08:36.097290 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-czmsr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67f7f734-b59a-447c-b4a5-5aeb78d3a4dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://620c115cda692d86e1c655fe633ade8d56b4ad3faff70ec3383e0d6931e91acd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84e2aad54ae491e74845fbb681491bde96694b5f242d15a1b4c0f62b81048e7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84e2aad54ae491e74845fbb681491bde96694b5f242d15a1b4c0f62b81048e7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2dfd4b5d1804d9d8e9ccf4dea8fb23cf60cd039e32d4bfc7d0b0e189da178b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2dfd4b5d1804d9d8e9ccf4dea8fb23cf60cd039e32d4bfc7d0b0e189da178b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c337ed2525a5d0aca41ff193138829ecd98172765110f97081d5a9b57ea39519\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c337ed2525a5d0aca41ff193138829ecd98172765110f97081d5a9b57ea39519\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b53d862bdcbdf9ebb58ca6717073ecf18236be981bf633bf22e3a077fa6ed90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://
5b53d862bdcbdf9ebb58ca6717073ecf18236be981bf633bf22e3a077fa6ed90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3de85e8780fadb509355343e297ea73c2bd1047219be9911d445e044d40d378b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3de85e8780fadb509355343e297ea73c2bd1047219be9911d445e044d40d378b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6309cf358d3d85018e9e89fdb974146ab00944de09d3b38cfbf357df59b442f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6309cf358d3d85018e9e89fdb974146ab00944de09d3b38cfbf357df59b442f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-czmsr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:36Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:36 crc kubenswrapper[4677]: I1007 13:08:36.119668 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:36Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:36 crc kubenswrapper[4677]: I1007 13:08:36.134737 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:36 crc kubenswrapper[4677]: I1007 13:08:36.135091 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:36 crc kubenswrapper[4677]: I1007 13:08:36.135295 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:36 crc kubenswrapper[4677]: I1007 13:08:36.135541 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:36 crc kubenswrapper[4677]: I1007 13:08:36.135698 4677 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:36Z","lastTransitionTime":"2025-10-07T13:08:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:36 crc kubenswrapper[4677]: I1007 13:08:36.137458 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8xd94" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f78c6e9a-e5e3-4296-b2e6-3ba36d1808ff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2e7ebbc9f01ac7f853075c65c8cc57c691cf3f95e41036294486ad4a3bb807c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8xd94\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:36Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:36 crc kubenswrapper[4677]: I1007 13:08:36.169480 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9c35782-52f8-4fbc-9e52-07ee92002e3d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9182677b05c8d32f333b4e806b6dc29e0ce3f6171616ed303459ccb6a3754a4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://550a4491cebcbd8b3a62831cce07b13bb79051cd51505aab1f74bcfee692f7b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d90d7cc786a9f94c269a99be97c00685a2e10bde12e0afe4db2de40b95749a47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e99541a9f53339e760fb1074be18ebfcb8b225c
64c290478559d2e3722ba9296\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73cec6b690e4d01d9d206a812f278832d622a7bdfb74ddcfb5904e19f721fae6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f16639187300b01b7c9f30dda26da7c1de9de9c66baf7ad716875eba41a5d7d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f16639187300b01b7c9f30dda26da7c1de9de9c66baf7ad716875eba41a5d7d6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dab4e59081dcacddedf345841dce8c940fae52aeee55d6c956e1364dd70872b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dab4e59081dcacddedf345841dce8c940fae52aeee55d6c956e1364dd70872b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://abcfe5aa83c73ae24b7f0b414b47344969924ca50a5e1669bfb4704f1386cd17\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abcfe5aa83c73ae24b7f0b414b47344969924ca50a5e1669bfb4704f1386cd17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:36Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:36 crc kubenswrapper[4677]: I1007 13:08:36.191204 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ad65790-6a90-4c21-b5c5-ac1ddf2cbe52\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34b15b8f71a10920a74f784c3440031e14726f661e10f628b269da08e70a7cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a51c6d4e82136b444754dc679f86455
8f74624af4ff94f794e473c92c8f6c87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96d7ddc445a61d5fd4959d1b3b4e2c93503111a12f461d945dd298a3f8540f65\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13a4e9dcaf4f4585c45625444ba093f84acc83f03e96235efd054bed4a38fc21\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f3139524d94a487c756b4a7e3660c0a4380bcba8aa8b588368530f43aab0b1d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T13:07:25Z\\\",\\\"message\\\":\\\"le observer\\\\nW1007 13:07:25.065953 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1007 13:07:25.066115 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 13:07:25.067558 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2107846005/tls.crt::/tmp/serving-cert-2107846005/tls.key\\\\\\\"\\\\nI1007 13:07:25.378394 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1007 13:07:25.383525 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1007 13:07:25.383565 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1007 13:07:25.383605 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1007 13:07:25.383617 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1007 13:07:25.390977 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1007 13:07:25.391014 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1007 13:07:25.391020 1 
secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 13:07:25.391038 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 13:07:25.391053 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1007 13:07:25.391061 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1007 13:07:25.391071 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1007 13:07:25.391088 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1007 13:07:25.394664 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b92962cb41f37a615f473651c01e37f5d53e01f3fb4b7c0eb2092095bb55239\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d927fbc5242eb951e8c2ec6d2859315f34e4b190bb43e646044111d1f583bf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d927fbc5242eb951e8c2ec6d2859315f34e4b190bb43e646044111d1f583bf0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:36Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:36 crc kubenswrapper[4677]: I1007 13:08:36.206570 4677 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfed5e4c-bba3-4aab-86f7-27b722b12d83\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://373e947054a32f5e1ecb5b66d2a5e668a14a1c76b2329cc4a60ddee65c80a3e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a70bca62773a15d295207b342b32a4263173ee7ebee7222bb16204210e168a52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a70bca62773a15d295207b342b32a4263173ee7ebee7222bb16204210e168a52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:09Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:36Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:36 crc kubenswrapper[4677]: I1007 13:08:36.227185 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ef29067dc23263a94c4f861ada9ebbe04aae442de3da9fa34db521177f60ce8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:36Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:36 crc kubenswrapper[4677]: I1007 13:08:36.238144 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:36 crc kubenswrapper[4677]: I1007 13:08:36.238201 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:36 crc kubenswrapper[4677]: I1007 13:08:36.238220 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:36 crc kubenswrapper[4677]: I1007 13:08:36.238244 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:36 crc kubenswrapper[4677]: I1007 13:08:36.238264 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:36Z","lastTransitionTime":"2025-10-07T13:08:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:36 crc kubenswrapper[4677]: I1007 13:08:36.302247 4677 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 13:08:36 crc kubenswrapper[4677]: E1007 13:08:36.302421 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 13:08:36 crc kubenswrapper[4677]: I1007 13:08:36.341564 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:36 crc kubenswrapper[4677]: I1007 13:08:36.341646 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:36 crc kubenswrapper[4677]: I1007 13:08:36.341673 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:36 crc kubenswrapper[4677]: I1007 13:08:36.341710 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:36 crc kubenswrapper[4677]: I1007 13:08:36.341734 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:36Z","lastTransitionTime":"2025-10-07T13:08:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:36 crc kubenswrapper[4677]: I1007 13:08:36.445085 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:36 crc kubenswrapper[4677]: I1007 13:08:36.445160 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:36 crc kubenswrapper[4677]: I1007 13:08:36.445187 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:36 crc kubenswrapper[4677]: I1007 13:08:36.445218 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:36 crc kubenswrapper[4677]: I1007 13:08:36.445240 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:36Z","lastTransitionTime":"2025-10-07T13:08:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:08:36 crc kubenswrapper[4677]: I1007 13:08:36.547752 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:36 crc kubenswrapper[4677]: I1007 13:08:36.547806 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:36 crc kubenswrapper[4677]: I1007 13:08:36.547825 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:36 crc kubenswrapper[4677]: I1007 13:08:36.547846 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:36 crc kubenswrapper[4677]: I1007 13:08:36.547863 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:36Z","lastTransitionTime":"2025-10-07T13:08:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:36 crc kubenswrapper[4677]: I1007 13:08:36.651351 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:36 crc kubenswrapper[4677]: I1007 13:08:36.651388 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:36 crc kubenswrapper[4677]: I1007 13:08:36.651396 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:36 crc kubenswrapper[4677]: I1007 13:08:36.651408 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:36 crc kubenswrapper[4677]: I1007 13:08:36.651415 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:36Z","lastTransitionTime":"2025-10-07T13:08:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:36 crc kubenswrapper[4677]: I1007 13:08:36.753269 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:36 crc kubenswrapper[4677]: I1007 13:08:36.753304 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:36 crc kubenswrapper[4677]: I1007 13:08:36.753312 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:36 crc kubenswrapper[4677]: I1007 13:08:36.753325 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:36 crc kubenswrapper[4677]: I1007 13:08:36.753333 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:36Z","lastTransitionTime":"2025-10-07T13:08:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:08:36 crc kubenswrapper[4677]: I1007 13:08:36.855752 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:36 crc kubenswrapper[4677]: I1007 13:08:36.855815 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:36 crc kubenswrapper[4677]: I1007 13:08:36.855830 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:36 crc kubenswrapper[4677]: I1007 13:08:36.855851 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:36 crc kubenswrapper[4677]: I1007 13:08:36.855867 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:36Z","lastTransitionTime":"2025-10-07T13:08:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:36 crc kubenswrapper[4677]: I1007 13:08:36.957899 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:36 crc kubenswrapper[4677]: I1007 13:08:36.957944 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:36 crc kubenswrapper[4677]: I1007 13:08:36.957954 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:36 crc kubenswrapper[4677]: I1007 13:08:36.957969 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:36 crc kubenswrapper[4677]: I1007 13:08:36.957980 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:36Z","lastTransitionTime":"2025-10-07T13:08:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:36 crc kubenswrapper[4677]: I1007 13:08:36.984061 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:36 crc kubenswrapper[4677]: I1007 13:08:36.984116 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:36 crc kubenswrapper[4677]: I1007 13:08:36.984132 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:36 crc kubenswrapper[4677]: I1007 13:08:36.984157 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:36 crc kubenswrapper[4677]: I1007 13:08:36.984174 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:36Z","lastTransitionTime":"2025-10-07T13:08:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:08:37 crc kubenswrapper[4677]: E1007 13:08:37.003951 4677 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:08:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:08:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:08:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:08:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:08:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:08:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:08:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:08:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2461c0fe-8a8b-483d-90f2-2a3d8d7aca47\\\",\\\"systemUUID\\\":\\\"68c6c527-b248-4c1e-9fd2-b44685e78bcf\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:37Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:37 crc kubenswrapper[4677]: I1007 13:08:37.008834 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:37 crc kubenswrapper[4677]: I1007 13:08:37.008884 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 07 13:08:37 crc kubenswrapper[4677]: I1007 13:08:37.008907 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:37 crc kubenswrapper[4677]: I1007 13:08:37.008937 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:37 crc kubenswrapper[4677]: I1007 13:08:37.008956 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:37Z","lastTransitionTime":"2025-10-07T13:08:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:37 crc kubenswrapper[4677]: E1007 13:08:37.031278 4677 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:08:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:08:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:08:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:08:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:08:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:08:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:08:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:08:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2461c0fe-8a8b-483d-90f2-2a3d8d7aca47\\\",\\\"systemUUID\\\":\\\"68c6c527-b248-4c1e-9fd2-b44685e78bcf\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:37Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:37 crc kubenswrapper[4677]: I1007 13:08:37.036496 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:37 crc kubenswrapper[4677]: I1007 13:08:37.036573 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 07 13:08:37 crc kubenswrapper[4677]: I1007 13:08:37.036593 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:37 crc kubenswrapper[4677]: I1007 13:08:37.036619 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:37 crc kubenswrapper[4677]: I1007 13:08:37.036636 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:37Z","lastTransitionTime":"2025-10-07T13:08:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:37 crc kubenswrapper[4677]: E1007 13:08:37.056082 4677 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:08:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:08:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:08:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:08:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:08:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:08:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:08:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:08:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2461c0fe-8a8b-483d-90f2-2a3d8d7aca47\\\",\\\"systemUUID\\\":\\\"68c6c527-b248-4c1e-9fd2-b44685e78bcf\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:37Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:37 crc kubenswrapper[4677]: I1007 13:08:37.061041 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:37 crc kubenswrapper[4677]: I1007 13:08:37.061092 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 07 13:08:37 crc kubenswrapper[4677]: I1007 13:08:37.061109 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:37 crc kubenswrapper[4677]: I1007 13:08:37.061133 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:37 crc kubenswrapper[4677]: I1007 13:08:37.061150 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:37Z","lastTransitionTime":"2025-10-07T13:08:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:37 crc kubenswrapper[4677]: E1007 13:08:37.082309 4677 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:08:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:08:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:08:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:08:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:08:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:08:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:08:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:08:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2461c0fe-8a8b-483d-90f2-2a3d8d7aca47\\\",\\\"systemUUID\\\":\\\"68c6c527-b248-4c1e-9fd2-b44685e78bcf\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:37Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:37 crc kubenswrapper[4677]: I1007 13:08:37.087537 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:37 crc kubenswrapper[4677]: I1007 13:08:37.087592 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 07 13:08:37 crc kubenswrapper[4677]: I1007 13:08:37.087611 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:37 crc kubenswrapper[4677]: I1007 13:08:37.087634 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:37 crc kubenswrapper[4677]: I1007 13:08:37.087651 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:37Z","lastTransitionTime":"2025-10-07T13:08:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:37 crc kubenswrapper[4677]: E1007 13:08:37.107202 4677 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:08:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:08:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:08:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:08:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:08:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:08:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:08:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:08:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2461c0fe-8a8b-483d-90f2-2a3d8d7aca47\\\",\\\"systemUUID\\\":\\\"68c6c527-b248-4c1e-9fd2-b44685e78bcf\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:37Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:37 crc kubenswrapper[4677]: E1007 13:08:37.107462 4677 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 07 13:08:37 crc kubenswrapper[4677]: I1007 13:08:37.109329 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Oct 07 13:08:37 crc kubenswrapper[4677]: I1007 13:08:37.109384 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:37 crc kubenswrapper[4677]: I1007 13:08:37.109401 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:37 crc kubenswrapper[4677]: I1007 13:08:37.109425 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:37 crc kubenswrapper[4677]: I1007 13:08:37.109474 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:37Z","lastTransitionTime":"2025-10-07T13:08:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:37 crc kubenswrapper[4677]: I1007 13:08:37.212329 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:37 crc kubenswrapper[4677]: I1007 13:08:37.212393 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:37 crc kubenswrapper[4677]: I1007 13:08:37.212414 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:37 crc kubenswrapper[4677]: I1007 13:08:37.212472 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:37 crc kubenswrapper[4677]: I1007 13:08:37.212494 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:37Z","lastTransitionTime":"2025-10-07T13:08:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:37 crc kubenswrapper[4677]: I1007 13:08:37.302500 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:08:37 crc kubenswrapper[4677]: I1007 13:08:37.302517 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 13:08:37 crc kubenswrapper[4677]: I1007 13:08:37.302607 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8bljr" Oct 07 13:08:37 crc kubenswrapper[4677]: E1007 13:08:37.302793 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 13:08:37 crc kubenswrapper[4677]: E1007 13:08:37.302887 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 13:08:37 crc kubenswrapper[4677]: E1007 13:08:37.303002 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8bljr" podUID="f63a77a6-7e4a-4ed0-a996-b8f80233d10c" Oct 07 13:08:37 crc kubenswrapper[4677]: I1007 13:08:37.315149 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:37 crc kubenswrapper[4677]: I1007 13:08:37.315207 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:37 crc kubenswrapper[4677]: I1007 13:08:37.315230 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:37 crc kubenswrapper[4677]: I1007 13:08:37.315259 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:37 crc kubenswrapper[4677]: I1007 13:08:37.315282 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:37Z","lastTransitionTime":"2025-10-07T13:08:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:37 crc kubenswrapper[4677]: I1007 13:08:37.418248 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:37 crc kubenswrapper[4677]: I1007 13:08:37.418325 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:37 crc kubenswrapper[4677]: I1007 13:08:37.418360 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:37 crc kubenswrapper[4677]: I1007 13:08:37.418390 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:37 crc kubenswrapper[4677]: I1007 13:08:37.418411 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:37Z","lastTransitionTime":"2025-10-07T13:08:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:08:37 crc kubenswrapper[4677]: I1007 13:08:37.521825 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:37 crc kubenswrapper[4677]: I1007 13:08:37.521892 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:37 crc kubenswrapper[4677]: I1007 13:08:37.521910 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:37 crc kubenswrapper[4677]: I1007 13:08:37.521934 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:37 crc kubenswrapper[4677]: I1007 13:08:37.521949 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:37Z","lastTransitionTime":"2025-10-07T13:08:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:37 crc kubenswrapper[4677]: I1007 13:08:37.625390 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:37 crc kubenswrapper[4677]: I1007 13:08:37.625500 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:37 crc kubenswrapper[4677]: I1007 13:08:37.625524 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:37 crc kubenswrapper[4677]: I1007 13:08:37.625551 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:37 crc kubenswrapper[4677]: I1007 13:08:37.625569 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:37Z","lastTransitionTime":"2025-10-07T13:08:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:37 crc kubenswrapper[4677]: I1007 13:08:37.728492 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:37 crc kubenswrapper[4677]: I1007 13:08:37.728564 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:37 crc kubenswrapper[4677]: I1007 13:08:37.728581 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:37 crc kubenswrapper[4677]: I1007 13:08:37.728610 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:37 crc kubenswrapper[4677]: I1007 13:08:37.728631 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:37Z","lastTransitionTime":"2025-10-07T13:08:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:08:37 crc kubenswrapper[4677]: I1007 13:08:37.831136 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:37 crc kubenswrapper[4677]: I1007 13:08:37.831215 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:37 crc kubenswrapper[4677]: I1007 13:08:37.831240 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:37 crc kubenswrapper[4677]: I1007 13:08:37.831267 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:37 crc kubenswrapper[4677]: I1007 13:08:37.831289 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:37Z","lastTransitionTime":"2025-10-07T13:08:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:37 crc kubenswrapper[4677]: I1007 13:08:37.934279 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:37 crc kubenswrapper[4677]: I1007 13:08:37.934351 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:37 crc kubenswrapper[4677]: I1007 13:08:37.934373 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:37 crc kubenswrapper[4677]: I1007 13:08:37.934402 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:37 crc kubenswrapper[4677]: I1007 13:08:37.934423 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:37Z","lastTransitionTime":"2025-10-07T13:08:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:38 crc kubenswrapper[4677]: I1007 13:08:38.037207 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:38 crc kubenswrapper[4677]: I1007 13:08:38.037298 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:38 crc kubenswrapper[4677]: I1007 13:08:38.037322 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:38 crc kubenswrapper[4677]: I1007 13:08:38.037347 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:38 crc kubenswrapper[4677]: I1007 13:08:38.037364 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:38Z","lastTransitionTime":"2025-10-07T13:08:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:08:38 crc kubenswrapper[4677]: I1007 13:08:38.145077 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:38 crc kubenswrapper[4677]: I1007 13:08:38.145170 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:38 crc kubenswrapper[4677]: I1007 13:08:38.145199 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:38 crc kubenswrapper[4677]: I1007 13:08:38.145355 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:38 crc kubenswrapper[4677]: I1007 13:08:38.145477 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:38Z","lastTransitionTime":"2025-10-07T13:08:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:38 crc kubenswrapper[4677]: I1007 13:08:38.249526 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:38 crc kubenswrapper[4677]: I1007 13:08:38.249607 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:38 crc kubenswrapper[4677]: I1007 13:08:38.249632 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:38 crc kubenswrapper[4677]: I1007 13:08:38.249665 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:38 crc kubenswrapper[4677]: I1007 13:08:38.249688 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:38Z","lastTransitionTime":"2025-10-07T13:08:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:38 crc kubenswrapper[4677]: I1007 13:08:38.302556 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 13:08:38 crc kubenswrapper[4677]: E1007 13:08:38.302747 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 13:08:38 crc kubenswrapper[4677]: I1007 13:08:38.352903 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:38 crc kubenswrapper[4677]: I1007 13:08:38.352974 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:38 crc kubenswrapper[4677]: I1007 13:08:38.352993 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:38 crc kubenswrapper[4677]: I1007 13:08:38.353018 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:38 crc kubenswrapper[4677]: I1007 13:08:38.353038 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:38Z","lastTransitionTime":"2025-10-07T13:08:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:38 crc kubenswrapper[4677]: I1007 13:08:38.456829 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:38 crc kubenswrapper[4677]: I1007 13:08:38.457210 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:38 crc kubenswrapper[4677]: I1007 13:08:38.457463 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:38 crc kubenswrapper[4677]: I1007 13:08:38.457526 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:38 crc kubenswrapper[4677]: I1007 13:08:38.457551 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:38Z","lastTransitionTime":"2025-10-07T13:08:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:08:38 crc kubenswrapper[4677]: I1007 13:08:38.561152 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:38 crc kubenswrapper[4677]: I1007 13:08:38.561657 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:38 crc kubenswrapper[4677]: I1007 13:08:38.561816 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:38 crc kubenswrapper[4677]: I1007 13:08:38.561985 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:38 crc kubenswrapper[4677]: I1007 13:08:38.562137 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:38Z","lastTransitionTime":"2025-10-07T13:08:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:38 crc kubenswrapper[4677]: I1007 13:08:38.667128 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:38 crc kubenswrapper[4677]: I1007 13:08:38.667554 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:38 crc kubenswrapper[4677]: I1007 13:08:38.667706 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:38 crc kubenswrapper[4677]: I1007 13:08:38.667840 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:38 crc kubenswrapper[4677]: I1007 13:08:38.667983 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:38Z","lastTransitionTime":"2025-10-07T13:08:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:38 crc kubenswrapper[4677]: I1007 13:08:38.772494 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:38 crc kubenswrapper[4677]: I1007 13:08:38.772577 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:38 crc kubenswrapper[4677]: I1007 13:08:38.772598 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:38 crc kubenswrapper[4677]: I1007 13:08:38.772624 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:38 crc kubenswrapper[4677]: I1007 13:08:38.772641 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:38Z","lastTransitionTime":"2025-10-07T13:08:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:08:38 crc kubenswrapper[4677]: I1007 13:08:38.876172 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:38 crc kubenswrapper[4677]: I1007 13:08:38.876215 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:38 crc kubenswrapper[4677]: I1007 13:08:38.876227 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:38 crc kubenswrapper[4677]: I1007 13:08:38.876246 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:38 crc kubenswrapper[4677]: I1007 13:08:38.876257 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:38Z","lastTransitionTime":"2025-10-07T13:08:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:38 crc kubenswrapper[4677]: I1007 13:08:38.980734 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:38 crc kubenswrapper[4677]: I1007 13:08:38.980790 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:38 crc kubenswrapper[4677]: I1007 13:08:38.980809 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:38 crc kubenswrapper[4677]: I1007 13:08:38.980834 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:38 crc kubenswrapper[4677]: I1007 13:08:38.980854 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:38Z","lastTransitionTime":"2025-10-07T13:08:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:39 crc kubenswrapper[4677]: I1007 13:08:39.083346 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:39 crc kubenswrapper[4677]: I1007 13:08:39.083543 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:39 crc kubenswrapper[4677]: I1007 13:08:39.083570 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:39 crc kubenswrapper[4677]: I1007 13:08:39.083598 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:39 crc kubenswrapper[4677]: I1007 13:08:39.083616 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:39Z","lastTransitionTime":"2025-10-07T13:08:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:08:39 crc kubenswrapper[4677]: I1007 13:08:39.186130 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:39 crc kubenswrapper[4677]: I1007 13:08:39.186206 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:39 crc kubenswrapper[4677]: I1007 13:08:39.186227 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:39 crc kubenswrapper[4677]: I1007 13:08:39.186256 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:39 crc kubenswrapper[4677]: I1007 13:08:39.186275 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:39Z","lastTransitionTime":"2025-10-07T13:08:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:39 crc kubenswrapper[4677]: I1007 13:08:39.289006 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:39 crc kubenswrapper[4677]: I1007 13:08:39.289044 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:39 crc kubenswrapper[4677]: I1007 13:08:39.289052 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:39 crc kubenswrapper[4677]: I1007 13:08:39.289087 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:39 crc kubenswrapper[4677]: I1007 13:08:39.289104 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:39Z","lastTransitionTime":"2025-10-07T13:08:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:39 crc kubenswrapper[4677]: I1007 13:08:39.302749 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 13:08:39 crc kubenswrapper[4677]: I1007 13:08:39.303088 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:08:39 crc kubenswrapper[4677]: I1007 13:08:39.303238 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8bljr" Oct 07 13:08:39 crc kubenswrapper[4677]: E1007 13:08:39.303796 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 13:08:39 crc kubenswrapper[4677]: E1007 13:08:39.303817 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 13:08:39 crc kubenswrapper[4677]: E1007 13:08:39.303890 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8bljr" podUID="f63a77a6-7e4a-4ed0-a996-b8f80233d10c" Oct 07 13:08:39 crc kubenswrapper[4677]: I1007 13:08:39.320851 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qf2v9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea5a5436-29f6-4edd-9d4d-22eb9dd828c3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1a8e8a31adbc84ed02ff984941bb00da95740b19e8717fc6d4fb39b62338973\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ftmxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://97188126bcea5ad3844f74c9402e831926e1142944778240b4d4b26da7ea40c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d77325
7453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ftmxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qf2v9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:39Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:39 crc kubenswrapper[4677]: I1007 13:08:39.337556 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8bljr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f63a77a6-7e4a-4ed0-a996-b8f80233d10c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rc97j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rc97j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:43Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8bljr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:39Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:39 crc kubenswrapper[4677]: I1007 13:08:39.356768 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c40c47d2-50a4-43f1-9b6e-08b60a3260c5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f36e52a7e88b59d8fd38c1fe659ce9b539e514c9d31e326a3ed647ebb8d19781\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://84b1c015461fecca9e5122abe950f33e24f4b7188568ea84cb059a08a4637963\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ecf81a2a9f147c0d9643f8e6c45248164053203ca4e5bbdc57c38e5803a5386\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e8dc3f8bdc52104efdb49a017d6497e2aaa3ed2b593794413fcd1acf2e06d36\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:39Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:39 crc kubenswrapper[4677]: I1007 13:08:39.371590 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:39Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:39 crc kubenswrapper[4677]: I1007 13:08:39.391872 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-r7cnz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7879fa59-a7cb-4d29-ba3a-c91f43bfcba6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5b2cfaaf4533573a7cdf927cb9a0b61690f4f04ca22f5da5013fd218ee2cba1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r59hd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d3e4ef8267212ad1faf24bfcb3b6f633a283684ba587e304e94d434bd9a2618\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r59hd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-r7cnz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:39Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:39 crc kubenswrapper[4677]: I1007 13:08:39.392095 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:39 crc kubenswrapper[4677]: I1007 13:08:39.392137 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:39 crc kubenswrapper[4677]: I1007 13:08:39.392148 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:39 crc kubenswrapper[4677]: I1007 13:08:39.392166 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:39 crc kubenswrapper[4677]: I1007 13:08:39.392180 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:39Z","lastTransitionTime":"2025-10-07T13:08:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:08:39 crc kubenswrapper[4677]: I1007 13:08:39.422930 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29c8j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3458826a-000d-407d-92c8-236d1a05842e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cf7d8cdd34bc883eae38c5e4690efd4e1c29cc633b5bbadc5de2b5b844a9da3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b77b2aafb3baf1c5b72d62156bd1c1bec76385637d5795166fe3d4f22a169503\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\
":\\\"cri-o://99b5fbb5ad3249aa5264c37bd635ed5f6283ec72c7eb071002cd7bddc12052f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eee7c253a1a514447553be977a3e534608ef6a1178664bf139ee84ec41180db0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f333db7aeb7d3cd308131992b4cd1284c1c56e27bbfd731404febc0efc953925\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ddf4e352b778815786f6fb204486a53d958310e53569f89a2895fe388a727da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0636ac68feadc31552df6dee8669b4b1d477332b3405bff9bd63eaaa3362a6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0636ac68feadc31552df6dee8669b4b1d477332b3405bff9bd63eaaa3362a6f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T13:08:34Z\\\",\\\"message\\\":\\\"_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.244:9393:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d8772e82-b0a4-4596-87d3-3d517c13344b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1007 13:08:34.206383 6814 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-cluster-version/cluster-version-operator]} name:Service_openshift-cluster-version/cluster-version-operator_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.182:9099:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {61d39e4d-21a9-4387-9a2b-fa4ad14792e2}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1007 13:08:34.206522 6814 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to 
sh\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:08:33Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-29c8j_openshift-ovn-kubernetes(3458826a-000d-407d-92c8-236d1a05842e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1f410624ff7e026c196d43d5ef830ce7b34981b703d5399a135dab0122640ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveR
eadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa229d0805576aa04db8e58e52e8c8bdc07663414dde42a94cac4fe628cfac7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa229d0805576aa04db8e58e52e8c8bdc07663414dde42a94cac4fe628cfac7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-29c8j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:39Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:39 crc kubenswrapper[4677]: I1007 13:08:39.442920 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pjgpx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"73bebfb3-50b5-48b6-b348-1d1feb6202d2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fcea2caf828321399fab99a6225cb39dd0c4aba8481cc040a10d86e90b6e4029\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b0ac92c71edc3d5107aece2d0e005a546cf25d79d696f4e330b7c0c8babc546\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T13:08:16Z\\\",\\\"message\\\":\\\"2025-10-07T13:07:31+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_d2e9730f-3112-4cf9-bde0-fbf3e73808c6\\\\n2025-10-07T13:07:31+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_d2e9730f-3112-4cf9-bde0-fbf3e73808c6 to /host/opt/cni/bin/\\\\n2025-10-07T13:07:31Z [verbose] multus-daemon started\\\\n2025-10-07T13:07:31Z [verbose] Readiness Indicator file check\\\\n2025-10-07T13:08:16Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:08:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h59cg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pjgpx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:39Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:39 crc kubenswrapper[4677]: I1007 13:08:39.460727 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c3400d7-6126-498b-ba93-b88903b8d698\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:08:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:08:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffff3685813b0115a56c61e90cb80318d0265429d9be16eeb9a4d0870ec2a442\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://10f7bfbc2ad9b8a554ee30118d74323aeddc334939ab97ab61cbd5eb24ae1db3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c50c0057805c0f200830303329e9c3c8c75c20246ace7131caff6afb6aca6f79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6cd7e95c3dad799fcc99041e53970f7f6be7e9f4280d724394e9a06051043706\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6cd7e95c3dad799fcc99041e53970f7f6be7e9f4280d724394e9a06051043706\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:10Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:09Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:39Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:39 crc kubenswrapper[4677]: I1007 13:08:39.481393 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf9347ca53bc58ad2e19bdbccd5eb40fde5ef36cdc0c2a2899e7e86977208446\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7932ee6d24ab75f34dabb17b5c2732dc1437e94b4fab6cace5c5bf4d8b4a8fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:39Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:39 crc kubenswrapper[4677]: I1007 13:08:39.494229 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:39 crc kubenswrapper[4677]: I1007 13:08:39.494277 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:39 crc kubenswrapper[4677]: I1007 13:08:39.494291 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:39 crc kubenswrapper[4677]: I1007 13:08:39.494344 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:39 crc kubenswrapper[4677]: I1007 13:08:39.494359 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:39Z","lastTransitionTime":"2025-10-07T13:08:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:08:39 crc kubenswrapper[4677]: I1007 13:08:39.495389 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c2h2k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6a7b491-6ed9-4906-8d2d-d8913a581b95\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://079f44a0676fd6e659268707658dfce76f5c80881ebd1b7f77b831a653002cbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gh4gv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c2h2k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:39Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:39 crc kubenswrapper[4677]: I1007 13:08:39.520155 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-czmsr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"67f7f734-b59a-447c-b4a5-5aeb78d3a4dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://620c115cda692d86e1c655fe633ade8d56b4ad3faff70ec3383e0d6931e91acd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84e2aad54ae491e74845fbb681491bde96694b5f242d15a1b4c0f62b81048e7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84e2aad54ae491e74845fbb681491bde96694b5f242d15a1b4c0f62b81048e7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2dfd4b5d1804d9d8e9ccf4dea8fb23cf60cd039e32d4bfc7d0b0e189da178b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2dfd4b5d1804d9d8e9ccf4dea8fb23cf60cd039e32d4bfc7d0b0e189da178b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c337ed2525a5d0aca41ff193138829ecd98172765110f97081d5a9b57ea39519\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c337ed2525a5d0aca41ff193138829ecd98172765110f97081d5a9b57ea39519\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b53d862bdcbdf9ebb58ca6717073ecf18236be981bf633bf22e3a077fa6ed90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b53d862bdcbdf9ebb58ca6717073ecf18236be981bf633bf22e3a077fa6ed90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3de85e8780fadb509355343e297ea73c2bd1047219be9911d445e044d40d378b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3de85e8780fadb509355343e297ea73c2bd1047219be9911d445e044d40d378b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6309cf358d3d85018e9e89fdb974146ab00944de09d3b38cfbf357df59b442f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6309cf358d3d85018e9e89fdb974146ab00944de09d3b38cfbf357df59b442f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-czmsr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:39Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:39 crc kubenswrapper[4677]: I1007 13:08:39.532892 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8xd94" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f78c6e9a-e5e3-4296-b2e6-3ba36d1808ff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2e7ebbc9f01ac7f853075c65c8cc57c691cf3f95e41036294486ad4a3bb807c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8xd94\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:39Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:39 crc kubenswrapper[4677]: I1007 13:08:39.567267 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9c35782-52f8-4fbc-9e52-07ee92002e3d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9182677b05c8d32f333b4e806b6dc29e0ce3f6171616ed303459ccb6a3754a4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://550a4491cebcbd8b3a62831cce07b13bb79051cd51505aab1f74bcfee692f7b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d90d7cc786a9f94c269a99be97c00685a2e10bde12e0afe4db2de40b95749a47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e99541a9f53339e760fb1074be18ebfcb8b225c
64c290478559d2e3722ba9296\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73cec6b690e4d01d9d206a812f278832d622a7bdfb74ddcfb5904e19f721fae6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f16639187300b01b7c9f30dda26da7c1de9de9c66baf7ad716875eba41a5d7d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f16639187300b01b7c9f30dda26da7c1de9de9c66baf7ad716875eba41a5d7d6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dab4e59081dcacddedf345841dce8c940fae52aeee55d6c956e1364dd70872b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dab4e59081dcacddedf345841dce8c940fae52aeee55d6c956e1364dd70872b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://abcfe5aa83c73ae24b7f0b414b47344969924ca50a5e1669bfb4704f1386cd17\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abcfe5aa83c73ae24b7f0b414b47344969924ca50a5e1669bfb4704f1386cd17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:39Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:39 crc kubenswrapper[4677]: I1007 13:08:39.590225 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ad65790-6a90-4c21-b5c5-ac1ddf2cbe52\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34b15b8f71a10920a74f784c3440031e14726f661e10f628b269da08e70a7cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a51c6d4e82136b444754dc679f86455
8f74624af4ff94f794e473c92c8f6c87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96d7ddc445a61d5fd4959d1b3b4e2c93503111a12f461d945dd298a3f8540f65\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13a4e9dcaf4f4585c45625444ba093f84acc83f03e96235efd054bed4a38fc21\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f3139524d94a487c756b4a7e3660c0a4380bcba8aa8b588368530f43aab0b1d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T13:07:25Z\\\",\\\"message\\\":\\\"le observer\\\\nW1007 13:07:25.065953 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1007 13:07:25.066115 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 13:07:25.067558 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2107846005/tls.crt::/tmp/serving-cert-2107846005/tls.key\\\\\\\"\\\\nI1007 13:07:25.378394 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1007 13:07:25.383525 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1007 13:07:25.383565 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1007 13:07:25.383605 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1007 13:07:25.383617 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1007 13:07:25.390977 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1007 13:07:25.391014 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1007 13:07:25.391020 1 
secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 13:07:25.391038 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 13:07:25.391053 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1007 13:07:25.391061 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1007 13:07:25.391071 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1007 13:07:25.391088 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1007 13:07:25.394664 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b92962cb41f37a615f473651c01e37f5d53e01f3fb4b7c0eb2092095bb55239\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d927fbc5242eb951e8c2ec6d2859315f34e4b190bb43e646044111d1f583bf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d927fbc5242eb951e8c2ec6d2859315f34e4b190bb43e646044111d1f583bf0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:39Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:39 crc kubenswrapper[4677]: I1007 13:08:39.596999 4677 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:39 crc kubenswrapper[4677]: I1007 13:08:39.597048 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:39 crc kubenswrapper[4677]: I1007 13:08:39.597066 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:39 crc kubenswrapper[4677]: I1007 13:08:39.597090 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:39 crc kubenswrapper[4677]: I1007 13:08:39.597107 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:39Z","lastTransitionTime":"2025-10-07T13:08:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:39 crc kubenswrapper[4677]: I1007 13:08:39.605284 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfed5e4c-bba3-4aab-86f7-27b722b12d83\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://373e947054a32f5e1ecb5b66d2a5e668a14a1c76b2329cc4a60ddee65c80a3e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a70bca62773a15d295207b342b32a4263173ee7ebee7222bb16204210e168a52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a70bca62773a15d295207b342b32a4263173ee7ebee7222bb16204210e168a52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:09Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:39Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:39 crc kubenswrapper[4677]: I1007 13:08:39.622767 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ef29067dc23263a94c4f861ada9ebbe04aae442de3da9fa34db521177f60ce8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:39Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:39 crc kubenswrapper[4677]: I1007 13:08:39.639733 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:39Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:39 crc kubenswrapper[4677]: I1007 13:08:39.654391 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:39Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:39 crc kubenswrapper[4677]: I1007 13:08:39.671221 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e49f4b2f98de9e297e6a31a5583120192adf9a013700b49bb419e54d9e75fdbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:39Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:39 crc kubenswrapper[4677]: I1007 13:08:39.704653 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:39 crc kubenswrapper[4677]: I1007 13:08:39.704709 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:39 crc kubenswrapper[4677]: I1007 13:08:39.704732 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:39 crc kubenswrapper[4677]: I1007 13:08:39.704760 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:39 crc kubenswrapper[4677]: I1007 13:08:39.704782 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:39Z","lastTransitionTime":"2025-10-07T13:08:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:39 crc kubenswrapper[4677]: I1007 13:08:39.808051 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:39 crc kubenswrapper[4677]: I1007 13:08:39.808131 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:39 crc kubenswrapper[4677]: I1007 13:08:39.808154 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:39 crc kubenswrapper[4677]: I1007 13:08:39.808185 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:39 crc kubenswrapper[4677]: I1007 13:08:39.808209 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:39Z","lastTransitionTime":"2025-10-07T13:08:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:08:39 crc kubenswrapper[4677]: I1007 13:08:39.910173 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:39 crc kubenswrapper[4677]: I1007 13:08:39.910232 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:39 crc kubenswrapper[4677]: I1007 13:08:39.910250 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:39 crc kubenswrapper[4677]: I1007 13:08:39.910273 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:39 crc kubenswrapper[4677]: I1007 13:08:39.910294 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:39Z","lastTransitionTime":"2025-10-07T13:08:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:40 crc kubenswrapper[4677]: I1007 13:08:40.013372 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:40 crc kubenswrapper[4677]: I1007 13:08:40.013780 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:40 crc kubenswrapper[4677]: I1007 13:08:40.014170 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:40 crc kubenswrapper[4677]: I1007 13:08:40.014340 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:40 crc kubenswrapper[4677]: I1007 13:08:40.014546 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:40Z","lastTransitionTime":"2025-10-07T13:08:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:40 crc kubenswrapper[4677]: I1007 13:08:40.118095 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:40 crc kubenswrapper[4677]: I1007 13:08:40.118147 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:40 crc kubenswrapper[4677]: I1007 13:08:40.118162 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:40 crc kubenswrapper[4677]: I1007 13:08:40.118185 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:40 crc kubenswrapper[4677]: I1007 13:08:40.118202 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:40Z","lastTransitionTime":"2025-10-07T13:08:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:08:40 crc kubenswrapper[4677]: I1007 13:08:40.221030 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:40 crc kubenswrapper[4677]: I1007 13:08:40.221172 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:40 crc kubenswrapper[4677]: I1007 13:08:40.221251 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:40 crc kubenswrapper[4677]: I1007 13:08:40.221289 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:40 crc kubenswrapper[4677]: I1007 13:08:40.221315 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:40Z","lastTransitionTime":"2025-10-07T13:08:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:40 crc kubenswrapper[4677]: I1007 13:08:40.303160 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 13:08:40 crc kubenswrapper[4677]: E1007 13:08:40.303387 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 13:08:40 crc kubenswrapper[4677]: I1007 13:08:40.325020 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:40 crc kubenswrapper[4677]: I1007 13:08:40.325082 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:40 crc kubenswrapper[4677]: I1007 13:08:40.325099 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:40 crc kubenswrapper[4677]: I1007 13:08:40.325123 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:40 crc kubenswrapper[4677]: I1007 13:08:40.325140 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:40Z","lastTransitionTime":"2025-10-07T13:08:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:08:40 crc kubenswrapper[4677]: I1007 13:08:40.428099 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:40 crc kubenswrapper[4677]: I1007 13:08:40.428174 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:40 crc kubenswrapper[4677]: I1007 13:08:40.428192 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:40 crc kubenswrapper[4677]: I1007 13:08:40.428217 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:40 crc kubenswrapper[4677]: I1007 13:08:40.428234 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:40Z","lastTransitionTime":"2025-10-07T13:08:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:40 crc kubenswrapper[4677]: I1007 13:08:40.531132 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:40 crc kubenswrapper[4677]: I1007 13:08:40.531203 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:40 crc kubenswrapper[4677]: I1007 13:08:40.531227 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:40 crc kubenswrapper[4677]: I1007 13:08:40.531258 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:40 crc kubenswrapper[4677]: I1007 13:08:40.531281 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:40Z","lastTransitionTime":"2025-10-07T13:08:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:40 crc kubenswrapper[4677]: I1007 13:08:40.634566 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:40 crc kubenswrapper[4677]: I1007 13:08:40.634627 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:40 crc kubenswrapper[4677]: I1007 13:08:40.634646 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:40 crc kubenswrapper[4677]: I1007 13:08:40.634672 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:40 crc kubenswrapper[4677]: I1007 13:08:40.634690 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:40Z","lastTransitionTime":"2025-10-07T13:08:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:08:40 crc kubenswrapper[4677]: I1007 13:08:40.738324 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:40 crc kubenswrapper[4677]: I1007 13:08:40.738396 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:40 crc kubenswrapper[4677]: I1007 13:08:40.738421 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:40 crc kubenswrapper[4677]: I1007 13:08:40.738492 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:40 crc kubenswrapper[4677]: I1007 13:08:40.738512 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:40Z","lastTransitionTime":"2025-10-07T13:08:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:40 crc kubenswrapper[4677]: I1007 13:08:40.841955 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:40 crc kubenswrapper[4677]: I1007 13:08:40.842011 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:40 crc kubenswrapper[4677]: I1007 13:08:40.842031 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:40 crc kubenswrapper[4677]: I1007 13:08:40.842053 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:40 crc kubenswrapper[4677]: I1007 13:08:40.842070 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:40Z","lastTransitionTime":"2025-10-07T13:08:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:40 crc kubenswrapper[4677]: I1007 13:08:40.945771 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:40 crc kubenswrapper[4677]: I1007 13:08:40.945839 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:40 crc kubenswrapper[4677]: I1007 13:08:40.945861 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:40 crc kubenswrapper[4677]: I1007 13:08:40.945892 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:40 crc kubenswrapper[4677]: I1007 13:08:40.945914 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:40Z","lastTransitionTime":"2025-10-07T13:08:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:08:41 crc kubenswrapper[4677]: I1007 13:08:41.049800 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:41 crc kubenswrapper[4677]: I1007 13:08:41.049868 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:41 crc kubenswrapper[4677]: I1007 13:08:41.049890 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:41 crc kubenswrapper[4677]: I1007 13:08:41.049913 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:41 crc kubenswrapper[4677]: I1007 13:08:41.049929 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:41Z","lastTransitionTime":"2025-10-07T13:08:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:41 crc kubenswrapper[4677]: I1007 13:08:41.153496 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:41 crc kubenswrapper[4677]: I1007 13:08:41.153538 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:41 crc kubenswrapper[4677]: I1007 13:08:41.153548 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:41 crc kubenswrapper[4677]: I1007 13:08:41.153564 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:41 crc kubenswrapper[4677]: I1007 13:08:41.153573 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:41Z","lastTransitionTime":"2025-10-07T13:08:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:41 crc kubenswrapper[4677]: I1007 13:08:41.257107 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:41 crc kubenswrapper[4677]: I1007 13:08:41.257177 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:41 crc kubenswrapper[4677]: I1007 13:08:41.257198 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:41 crc kubenswrapper[4677]: I1007 13:08:41.257226 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:41 crc kubenswrapper[4677]: I1007 13:08:41.257245 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:41Z","lastTransitionTime":"2025-10-07T13:08:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:08:41 crc kubenswrapper[4677]: I1007 13:08:41.302187 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 13:08:41 crc kubenswrapper[4677]: I1007 13:08:41.302234 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:08:41 crc kubenswrapper[4677]: E1007 13:08:41.302554 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 13:08:41 crc kubenswrapper[4677]: I1007 13:08:41.302592 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8bljr" Oct 07 13:08:41 crc kubenswrapper[4677]: E1007 13:08:41.302926 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8bljr" podUID="f63a77a6-7e4a-4ed0-a996-b8f80233d10c" Oct 07 13:08:41 crc kubenswrapper[4677]: E1007 13:08:41.303123 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 13:08:41 crc kubenswrapper[4677]: I1007 13:08:41.359925 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:41 crc kubenswrapper[4677]: I1007 13:08:41.359977 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:41 crc kubenswrapper[4677]: I1007 13:08:41.359994 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:41 crc kubenswrapper[4677]: I1007 13:08:41.360019 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:41 crc kubenswrapper[4677]: I1007 13:08:41.360037 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:41Z","lastTransitionTime":"2025-10-07T13:08:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:08:41 crc kubenswrapper[4677]: I1007 13:08:41.462964 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:41 crc kubenswrapper[4677]: I1007 13:08:41.463083 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:41 crc kubenswrapper[4677]: I1007 13:08:41.463102 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:41 crc kubenswrapper[4677]: I1007 13:08:41.463126 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:41 crc kubenswrapper[4677]: I1007 13:08:41.463142 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:41Z","lastTransitionTime":"2025-10-07T13:08:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:41 crc kubenswrapper[4677]: I1007 13:08:41.567113 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:41 crc kubenswrapper[4677]: I1007 13:08:41.567191 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:41 crc kubenswrapper[4677]: I1007 13:08:41.567213 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:41 crc kubenswrapper[4677]: I1007 13:08:41.567240 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:41 crc kubenswrapper[4677]: I1007 13:08:41.567260 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:41Z","lastTransitionTime":"2025-10-07T13:08:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:41 crc kubenswrapper[4677]: I1007 13:08:41.670264 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:41 crc kubenswrapper[4677]: I1007 13:08:41.670325 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:41 crc kubenswrapper[4677]: I1007 13:08:41.670342 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:41 crc kubenswrapper[4677]: I1007 13:08:41.670377 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:41 crc kubenswrapper[4677]: I1007 13:08:41.670396 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:41Z","lastTransitionTime":"2025-10-07T13:08:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:08:41 crc kubenswrapper[4677]: I1007 13:08:41.773547 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:41 crc kubenswrapper[4677]: I1007 13:08:41.773606 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:41 crc kubenswrapper[4677]: I1007 13:08:41.773623 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:41 crc kubenswrapper[4677]: I1007 13:08:41.773647 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:41 crc kubenswrapper[4677]: I1007 13:08:41.773665 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:41Z","lastTransitionTime":"2025-10-07T13:08:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:41 crc kubenswrapper[4677]: I1007 13:08:41.876756 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:41 crc kubenswrapper[4677]: I1007 13:08:41.876818 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:41 crc kubenswrapper[4677]: I1007 13:08:41.876842 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:41 crc kubenswrapper[4677]: I1007 13:08:41.876872 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:41 crc kubenswrapper[4677]: I1007 13:08:41.876895 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:41Z","lastTransitionTime":"2025-10-07T13:08:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:41 crc kubenswrapper[4677]: I1007 13:08:41.979234 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:41 crc kubenswrapper[4677]: I1007 13:08:41.979284 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:41 crc kubenswrapper[4677]: I1007 13:08:41.979301 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:41 crc kubenswrapper[4677]: I1007 13:08:41.979326 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:41 crc kubenswrapper[4677]: I1007 13:08:41.979342 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:41Z","lastTransitionTime":"2025-10-07T13:08:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:08:42 crc kubenswrapper[4677]: I1007 13:08:42.082007 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:42 crc kubenswrapper[4677]: I1007 13:08:42.082078 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:42 crc kubenswrapper[4677]: I1007 13:08:42.082101 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:42 crc kubenswrapper[4677]: I1007 13:08:42.082130 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:42 crc kubenswrapper[4677]: I1007 13:08:42.082152 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:42Z","lastTransitionTime":"2025-10-07T13:08:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:42 crc kubenswrapper[4677]: I1007 13:08:42.185715 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:42 crc kubenswrapper[4677]: I1007 13:08:42.185788 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:42 crc kubenswrapper[4677]: I1007 13:08:42.185813 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:42 crc kubenswrapper[4677]: I1007 13:08:42.185845 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:42 crc kubenswrapper[4677]: I1007 13:08:42.185870 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:42Z","lastTransitionTime":"2025-10-07T13:08:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:42 crc kubenswrapper[4677]: I1007 13:08:42.288570 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:42 crc kubenswrapper[4677]: I1007 13:08:42.288646 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:42 crc kubenswrapper[4677]: I1007 13:08:42.288667 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:42 crc kubenswrapper[4677]: I1007 13:08:42.288695 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:42 crc kubenswrapper[4677]: I1007 13:08:42.288712 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:42Z","lastTransitionTime":"2025-10-07T13:08:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:08:42 crc kubenswrapper[4677]: I1007 13:08:42.302130 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 13:08:42 crc kubenswrapper[4677]: E1007 13:08:42.302296 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 13:08:42 crc kubenswrapper[4677]: I1007 13:08:42.391422 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:42 crc kubenswrapper[4677]: I1007 13:08:42.391529 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:42 crc kubenswrapper[4677]: I1007 13:08:42.391554 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:42 crc kubenswrapper[4677]: I1007 13:08:42.391581 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:42 crc kubenswrapper[4677]: I1007 13:08:42.391598 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:42Z","lastTransitionTime":"2025-10-07T13:08:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:42 crc kubenswrapper[4677]: I1007 13:08:42.494998 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:42 crc kubenswrapper[4677]: I1007 13:08:42.495068 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:42 crc kubenswrapper[4677]: I1007 13:08:42.495091 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:42 crc kubenswrapper[4677]: I1007 13:08:42.495120 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:42 crc kubenswrapper[4677]: I1007 13:08:42.495140 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:42Z","lastTransitionTime":"2025-10-07T13:08:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:08:42 crc kubenswrapper[4677]: I1007 13:08:42.598140 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:42 crc kubenswrapper[4677]: I1007 13:08:42.598200 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:42 crc kubenswrapper[4677]: I1007 13:08:42.598211 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:42 crc kubenswrapper[4677]: I1007 13:08:42.598228 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:42 crc kubenswrapper[4677]: I1007 13:08:42.598241 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:42Z","lastTransitionTime":"2025-10-07T13:08:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:42 crc kubenswrapper[4677]: I1007 13:08:42.701251 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:42 crc kubenswrapper[4677]: I1007 13:08:42.701338 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:42 crc kubenswrapper[4677]: I1007 13:08:42.701364 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:42 crc kubenswrapper[4677]: I1007 13:08:42.701395 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:42 crc kubenswrapper[4677]: I1007 13:08:42.701418 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:42Z","lastTransitionTime":"2025-10-07T13:08:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:42 crc kubenswrapper[4677]: I1007 13:08:42.804504 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:42 crc kubenswrapper[4677]: I1007 13:08:42.804569 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:42 crc kubenswrapper[4677]: I1007 13:08:42.804586 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:42 crc kubenswrapper[4677]: I1007 13:08:42.804609 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:42 crc kubenswrapper[4677]: I1007 13:08:42.804627 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:42Z","lastTransitionTime":"2025-10-07T13:08:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:08:42 crc kubenswrapper[4677]: I1007 13:08:42.907583 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:42 crc kubenswrapper[4677]: I1007 13:08:42.907652 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:42 crc kubenswrapper[4677]: I1007 13:08:42.907677 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:42 crc kubenswrapper[4677]: I1007 13:08:42.907711 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:42 crc kubenswrapper[4677]: I1007 13:08:42.907734 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:42Z","lastTransitionTime":"2025-10-07T13:08:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:43 crc kubenswrapper[4677]: I1007 13:08:43.011797 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:43 crc kubenswrapper[4677]: I1007 13:08:43.011898 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:43 crc kubenswrapper[4677]: I1007 13:08:43.011954 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:43 crc kubenswrapper[4677]: I1007 13:08:43.011989 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:43 crc kubenswrapper[4677]: I1007 13:08:43.012015 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:43Z","lastTransitionTime":"2025-10-07T13:08:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:43 crc kubenswrapper[4677]: I1007 13:08:43.115387 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:43 crc kubenswrapper[4677]: I1007 13:08:43.115491 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:43 crc kubenswrapper[4677]: I1007 13:08:43.115511 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:43 crc kubenswrapper[4677]: I1007 13:08:43.115536 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:43 crc kubenswrapper[4677]: I1007 13:08:43.115553 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:43Z","lastTransitionTime":"2025-10-07T13:08:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:08:43 crc kubenswrapper[4677]: I1007 13:08:43.217993 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:43 crc kubenswrapper[4677]: I1007 13:08:43.218124 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:43 crc kubenswrapper[4677]: I1007 13:08:43.218144 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:43 crc kubenswrapper[4677]: I1007 13:08:43.218169 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:43 crc kubenswrapper[4677]: I1007 13:08:43.218187 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:43Z","lastTransitionTime":"2025-10-07T13:08:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:43 crc kubenswrapper[4677]: I1007 13:08:43.302898 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:08:43 crc kubenswrapper[4677]: I1007 13:08:43.302919 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 13:08:43 crc kubenswrapper[4677]: I1007 13:08:43.303024 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8bljr" Oct 07 13:08:43 crc kubenswrapper[4677]: E1007 13:08:43.303282 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 13:08:43 crc kubenswrapper[4677]: E1007 13:08:43.303389 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 13:08:43 crc kubenswrapper[4677]: E1007 13:08:43.303617 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8bljr" podUID="f63a77a6-7e4a-4ed0-a996-b8f80233d10c" Oct 07 13:08:43 crc kubenswrapper[4677]: I1007 13:08:43.320013 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:43 crc kubenswrapper[4677]: I1007 13:08:43.320064 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:43 crc kubenswrapper[4677]: I1007 13:08:43.320081 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:43 crc kubenswrapper[4677]: I1007 13:08:43.320102 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:43 crc kubenswrapper[4677]: I1007 13:08:43.320118 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:43Z","lastTransitionTime":"2025-10-07T13:08:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:43 crc kubenswrapper[4677]: I1007 13:08:43.423321 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:43 crc kubenswrapper[4677]: I1007 13:08:43.423401 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:43 crc kubenswrapper[4677]: I1007 13:08:43.423421 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:43 crc kubenswrapper[4677]: I1007 13:08:43.423491 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:43 crc kubenswrapper[4677]: I1007 13:08:43.423526 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:43Z","lastTransitionTime":"2025-10-07T13:08:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:08:43 crc kubenswrapper[4677]: I1007 13:08:43.526833 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:43 crc kubenswrapper[4677]: I1007 13:08:43.526894 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:43 crc kubenswrapper[4677]: I1007 13:08:43.526917 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:43 crc kubenswrapper[4677]: I1007 13:08:43.526949 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:43 crc kubenswrapper[4677]: I1007 13:08:43.526970 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:43Z","lastTransitionTime":"2025-10-07T13:08:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:43 crc kubenswrapper[4677]: I1007 13:08:43.629237 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:43 crc kubenswrapper[4677]: I1007 13:08:43.629306 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:43 crc kubenswrapper[4677]: I1007 13:08:43.629331 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:43 crc kubenswrapper[4677]: I1007 13:08:43.629363 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:43 crc kubenswrapper[4677]: I1007 13:08:43.629392 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:43Z","lastTransitionTime":"2025-10-07T13:08:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:43 crc kubenswrapper[4677]: I1007 13:08:43.732088 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:43 crc kubenswrapper[4677]: I1007 13:08:43.732181 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:43 crc kubenswrapper[4677]: I1007 13:08:43.732207 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:43 crc kubenswrapper[4677]: I1007 13:08:43.732236 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:43 crc kubenswrapper[4677]: I1007 13:08:43.732253 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:43Z","lastTransitionTime":"2025-10-07T13:08:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:08:43 crc kubenswrapper[4677]: I1007 13:08:43.835040 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:43 crc kubenswrapper[4677]: I1007 13:08:43.835091 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:43 crc kubenswrapper[4677]: I1007 13:08:43.835107 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:43 crc kubenswrapper[4677]: I1007 13:08:43.835127 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:43 crc kubenswrapper[4677]: I1007 13:08:43.835138 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:43Z","lastTransitionTime":"2025-10-07T13:08:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:43 crc kubenswrapper[4677]: I1007 13:08:43.938523 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:43 crc kubenswrapper[4677]: I1007 13:08:43.938590 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:43 crc kubenswrapper[4677]: I1007 13:08:43.938607 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:43 crc kubenswrapper[4677]: I1007 13:08:43.938632 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:43 crc kubenswrapper[4677]: I1007 13:08:43.938649 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:43Z","lastTransitionTime":"2025-10-07T13:08:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:44 crc kubenswrapper[4677]: I1007 13:08:44.041349 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:44 crc kubenswrapper[4677]: I1007 13:08:44.041486 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:44 crc kubenswrapper[4677]: I1007 13:08:44.041523 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:44 crc kubenswrapper[4677]: I1007 13:08:44.041555 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:44 crc kubenswrapper[4677]: I1007 13:08:44.041576 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:44Z","lastTransitionTime":"2025-10-07T13:08:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:08:44 crc kubenswrapper[4677]: I1007 13:08:44.144192 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:44 crc kubenswrapper[4677]: I1007 13:08:44.144260 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:44 crc kubenswrapper[4677]: I1007 13:08:44.144280 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:44 crc kubenswrapper[4677]: I1007 13:08:44.144304 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:44 crc kubenswrapper[4677]: I1007 13:08:44.144322 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:44Z","lastTransitionTime":"2025-10-07T13:08:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:44 crc kubenswrapper[4677]: I1007 13:08:44.247933 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:44 crc kubenswrapper[4677]: I1007 13:08:44.248040 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:44 crc kubenswrapper[4677]: I1007 13:08:44.248108 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:44 crc kubenswrapper[4677]: I1007 13:08:44.248216 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:44 crc kubenswrapper[4677]: I1007 13:08:44.248245 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:44Z","lastTransitionTime":"2025-10-07T13:08:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:44 crc kubenswrapper[4677]: I1007 13:08:44.302748 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 13:08:44 crc kubenswrapper[4677]: E1007 13:08:44.302927 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 13:08:44 crc kubenswrapper[4677]: I1007 13:08:44.351153 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:44 crc kubenswrapper[4677]: I1007 13:08:44.351213 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:44 crc kubenswrapper[4677]: I1007 13:08:44.351230 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:44 crc kubenswrapper[4677]: I1007 13:08:44.351254 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:44 crc kubenswrapper[4677]: I1007 13:08:44.351273 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:44Z","lastTransitionTime":"2025-10-07T13:08:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:44 crc kubenswrapper[4677]: I1007 13:08:44.454189 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:44 crc kubenswrapper[4677]: I1007 13:08:44.454253 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:44 crc kubenswrapper[4677]: I1007 13:08:44.454265 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:44 crc kubenswrapper[4677]: I1007 13:08:44.454286 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:44 crc kubenswrapper[4677]: I1007 13:08:44.454299 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:44Z","lastTransitionTime":"2025-10-07T13:08:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:08:44 crc kubenswrapper[4677]: I1007 13:08:44.557060 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:44 crc kubenswrapper[4677]: I1007 13:08:44.557140 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:44 crc kubenswrapper[4677]: I1007 13:08:44.557165 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:44 crc kubenswrapper[4677]: I1007 13:08:44.557195 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:44 crc kubenswrapper[4677]: I1007 13:08:44.557214 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:44Z","lastTransitionTime":"2025-10-07T13:08:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:44 crc kubenswrapper[4677]: I1007 13:08:44.659539 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:44 crc kubenswrapper[4677]: I1007 13:08:44.659597 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:44 crc kubenswrapper[4677]: I1007 13:08:44.659614 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:44 crc kubenswrapper[4677]: I1007 13:08:44.659637 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:44 crc kubenswrapper[4677]: I1007 13:08:44.659654 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:44Z","lastTransitionTime":"2025-10-07T13:08:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:44 crc kubenswrapper[4677]: I1007 13:08:44.762838 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:44 crc kubenswrapper[4677]: I1007 13:08:44.762889 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:44 crc kubenswrapper[4677]: I1007 13:08:44.762906 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:44 crc kubenswrapper[4677]: I1007 13:08:44.762930 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:44 crc kubenswrapper[4677]: I1007 13:08:44.762948 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:44Z","lastTransitionTime":"2025-10-07T13:08:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:08:44 crc kubenswrapper[4677]: I1007 13:08:44.866992 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:44 crc kubenswrapper[4677]: I1007 13:08:44.867066 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:44 crc kubenswrapper[4677]: I1007 13:08:44.867084 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:44 crc kubenswrapper[4677]: I1007 13:08:44.867110 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:44 crc kubenswrapper[4677]: I1007 13:08:44.867128 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:44Z","lastTransitionTime":"2025-10-07T13:08:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:44 crc kubenswrapper[4677]: I1007 13:08:44.971871 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:44 crc kubenswrapper[4677]: I1007 13:08:44.971930 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:44 crc kubenswrapper[4677]: I1007 13:08:44.971947 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:44 crc kubenswrapper[4677]: I1007 13:08:44.971971 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:44 crc kubenswrapper[4677]: I1007 13:08:44.971990 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:44Z","lastTransitionTime":"2025-10-07T13:08:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:45 crc kubenswrapper[4677]: I1007 13:08:45.075822 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:45 crc kubenswrapper[4677]: I1007 13:08:45.075910 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:45 crc kubenswrapper[4677]: I1007 13:08:45.075924 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:45 crc kubenswrapper[4677]: I1007 13:08:45.075951 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:45 crc kubenswrapper[4677]: I1007 13:08:45.075965 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:45Z","lastTransitionTime":"2025-10-07T13:08:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:08:45 crc kubenswrapper[4677]: I1007 13:08:45.179011 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:45 crc kubenswrapper[4677]: I1007 13:08:45.179136 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:45 crc kubenswrapper[4677]: I1007 13:08:45.179157 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:45 crc kubenswrapper[4677]: I1007 13:08:45.179181 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:45 crc kubenswrapper[4677]: I1007 13:08:45.179202 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:45Z","lastTransitionTime":"2025-10-07T13:08:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:45 crc kubenswrapper[4677]: I1007 13:08:45.282186 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:45 crc kubenswrapper[4677]: I1007 13:08:45.282243 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:45 crc kubenswrapper[4677]: I1007 13:08:45.282267 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:45 crc kubenswrapper[4677]: I1007 13:08:45.282297 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:45 crc kubenswrapper[4677]: I1007 13:08:45.282319 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:45Z","lastTransitionTime":"2025-10-07T13:08:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:45 crc kubenswrapper[4677]: I1007 13:08:45.303024 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8bljr" Oct 07 13:08:45 crc kubenswrapper[4677]: I1007 13:08:45.303081 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:08:45 crc kubenswrapper[4677]: E1007 13:08:45.303206 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8bljr" podUID="f63a77a6-7e4a-4ed0-a996-b8f80233d10c" Oct 07 13:08:45 crc kubenswrapper[4677]: I1007 13:08:45.303273 4677 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 13:08:45 crc kubenswrapper[4677]: E1007 13:08:45.303376 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 13:08:45 crc kubenswrapper[4677]: E1007 13:08:45.303578 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 13:08:45 crc kubenswrapper[4677]: I1007 13:08:45.385637 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:45 crc kubenswrapper[4677]: I1007 13:08:45.385705 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:45 crc kubenswrapper[4677]: I1007 13:08:45.385730 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:45 crc kubenswrapper[4677]: I1007 13:08:45.385758 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:45 crc kubenswrapper[4677]: I1007 13:08:45.385777 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:45Z","lastTransitionTime":"2025-10-07T13:08:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:45 crc kubenswrapper[4677]: I1007 13:08:45.488784 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:45 crc kubenswrapper[4677]: I1007 13:08:45.488827 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:45 crc kubenswrapper[4677]: I1007 13:08:45.488845 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:45 crc kubenswrapper[4677]: I1007 13:08:45.488867 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:45 crc kubenswrapper[4677]: I1007 13:08:45.488885 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:45Z","lastTransitionTime":"2025-10-07T13:08:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:08:45 crc kubenswrapper[4677]: I1007 13:08:45.592526 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:45 crc kubenswrapper[4677]: I1007 13:08:45.592604 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:45 crc kubenswrapper[4677]: I1007 13:08:45.592626 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:45 crc kubenswrapper[4677]: I1007 13:08:45.592659 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:45 crc kubenswrapper[4677]: I1007 13:08:45.592680 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:45Z","lastTransitionTime":"2025-10-07T13:08:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:45 crc kubenswrapper[4677]: I1007 13:08:45.695465 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:45 crc kubenswrapper[4677]: I1007 13:08:45.695529 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:45 crc kubenswrapper[4677]: I1007 13:08:45.695553 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:45 crc kubenswrapper[4677]: I1007 13:08:45.695582 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:45 crc kubenswrapper[4677]: I1007 13:08:45.695602 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:45Z","lastTransitionTime":"2025-10-07T13:08:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:45 crc kubenswrapper[4677]: I1007 13:08:45.798230 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:45 crc kubenswrapper[4677]: I1007 13:08:45.798300 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:45 crc kubenswrapper[4677]: I1007 13:08:45.798398 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:45 crc kubenswrapper[4677]: I1007 13:08:45.798543 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:45 crc kubenswrapper[4677]: I1007 13:08:45.798581 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:45Z","lastTransitionTime":"2025-10-07T13:08:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:08:45 crc kubenswrapper[4677]: I1007 13:08:45.901610 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:45 crc kubenswrapper[4677]: I1007 13:08:45.901681 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:45 crc kubenswrapper[4677]: I1007 13:08:45.901706 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:45 crc kubenswrapper[4677]: I1007 13:08:45.901760 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:45 crc kubenswrapper[4677]: I1007 13:08:45.901786 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:45Z","lastTransitionTime":"2025-10-07T13:08:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:46 crc kubenswrapper[4677]: I1007 13:08:46.004300 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:46 crc kubenswrapper[4677]: I1007 13:08:46.004368 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:46 crc kubenswrapper[4677]: I1007 13:08:46.004390 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:46 crc kubenswrapper[4677]: I1007 13:08:46.004427 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:46 crc kubenswrapper[4677]: I1007 13:08:46.004496 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:46Z","lastTransitionTime":"2025-10-07T13:08:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:46 crc kubenswrapper[4677]: I1007 13:08:46.107721 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:46 crc kubenswrapper[4677]: I1007 13:08:46.107784 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:46 crc kubenswrapper[4677]: I1007 13:08:46.107801 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:46 crc kubenswrapper[4677]: I1007 13:08:46.107821 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:46 crc kubenswrapper[4677]: I1007 13:08:46.107836 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:46Z","lastTransitionTime":"2025-10-07T13:08:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:08:46 crc kubenswrapper[4677]: I1007 13:08:46.210707 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:46 crc kubenswrapper[4677]: I1007 13:08:46.210786 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:46 crc kubenswrapper[4677]: I1007 13:08:46.210805 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:46 crc kubenswrapper[4677]: I1007 13:08:46.210828 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:46 crc kubenswrapper[4677]: I1007 13:08:46.210846 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:46Z","lastTransitionTime":"2025-10-07T13:08:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:46 crc kubenswrapper[4677]: I1007 13:08:46.302654 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 13:08:46 crc kubenswrapper[4677]: E1007 13:08:46.303066 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 13:08:46 crc kubenswrapper[4677]: I1007 13:08:46.313579 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:46 crc kubenswrapper[4677]: I1007 13:08:46.313752 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:46 crc kubenswrapper[4677]: I1007 13:08:46.313931 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:46 crc kubenswrapper[4677]: I1007 13:08:46.314064 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:46 crc kubenswrapper[4677]: I1007 13:08:46.314201 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:46Z","lastTransitionTime":"2025-10-07T13:08:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:08:46 crc kubenswrapper[4677]: I1007 13:08:46.416670 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:46 crc kubenswrapper[4677]: I1007 13:08:46.416750 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:46 crc kubenswrapper[4677]: I1007 13:08:46.416770 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:46 crc kubenswrapper[4677]: I1007 13:08:46.416803 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:46 crc kubenswrapper[4677]: I1007 13:08:46.416825 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:46Z","lastTransitionTime":"2025-10-07T13:08:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:46 crc kubenswrapper[4677]: I1007 13:08:46.519991 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:46 crc kubenswrapper[4677]: I1007 13:08:46.520053 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:46 crc kubenswrapper[4677]: I1007 13:08:46.520070 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:46 crc kubenswrapper[4677]: I1007 13:08:46.520094 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:46 crc kubenswrapper[4677]: I1007 13:08:46.520110 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:46Z","lastTransitionTime":"2025-10-07T13:08:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:46 crc kubenswrapper[4677]: I1007 13:08:46.623183 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:46 crc kubenswrapper[4677]: I1007 13:08:46.623251 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:46 crc kubenswrapper[4677]: I1007 13:08:46.623274 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:46 crc kubenswrapper[4677]: I1007 13:08:46.623305 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:46 crc kubenswrapper[4677]: I1007 13:08:46.623328 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:46Z","lastTransitionTime":"2025-10-07T13:08:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:08:46 crc kubenswrapper[4677]: I1007 13:08:46.726695 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:46 crc kubenswrapper[4677]: I1007 13:08:46.726735 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:46 crc kubenswrapper[4677]: I1007 13:08:46.726747 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:46 crc kubenswrapper[4677]: I1007 13:08:46.726763 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:46 crc kubenswrapper[4677]: I1007 13:08:46.726776 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:46Z","lastTransitionTime":"2025-10-07T13:08:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:46 crc kubenswrapper[4677]: I1007 13:08:46.829765 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:46 crc kubenswrapper[4677]: I1007 13:08:46.829838 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:46 crc kubenswrapper[4677]: I1007 13:08:46.829863 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:46 crc kubenswrapper[4677]: I1007 13:08:46.829892 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:46 crc kubenswrapper[4677]: I1007 13:08:46.829917 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:46Z","lastTransitionTime":"2025-10-07T13:08:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:46 crc kubenswrapper[4677]: I1007 13:08:46.932516 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:46 crc kubenswrapper[4677]: I1007 13:08:46.932583 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:46 crc kubenswrapper[4677]: I1007 13:08:46.932600 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:46 crc kubenswrapper[4677]: I1007 13:08:46.932625 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:46 crc kubenswrapper[4677]: I1007 13:08:46.932643 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:46Z","lastTransitionTime":"2025-10-07T13:08:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:08:47 crc kubenswrapper[4677]: I1007 13:08:47.035805 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:47 crc kubenswrapper[4677]: I1007 13:08:47.035866 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:47 crc kubenswrapper[4677]: I1007 13:08:47.035891 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:47 crc kubenswrapper[4677]: I1007 13:08:47.035921 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:47 crc kubenswrapper[4677]: I1007 13:08:47.035944 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:47Z","lastTransitionTime":"2025-10-07T13:08:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:47 crc kubenswrapper[4677]: I1007 13:08:47.138844 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:47 crc kubenswrapper[4677]: I1007 13:08:47.138897 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:47 crc kubenswrapper[4677]: I1007 13:08:47.138914 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:47 crc kubenswrapper[4677]: I1007 13:08:47.138934 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:47 crc kubenswrapper[4677]: I1007 13:08:47.138948 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:47Z","lastTransitionTime":"2025-10-07T13:08:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:47 crc kubenswrapper[4677]: I1007 13:08:47.159948 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f63a77a6-7e4a-4ed0-a996-b8f80233d10c-metrics-certs\") pod \"network-metrics-daemon-8bljr\" (UID: \"f63a77a6-7e4a-4ed0-a996-b8f80233d10c\") " pod="openshift-multus/network-metrics-daemon-8bljr" Oct 07 13:08:47 crc kubenswrapper[4677]: E1007 13:08:47.160093 4677 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 07 13:08:47 crc kubenswrapper[4677]: E1007 13:08:47.160165 4677 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f63a77a6-7e4a-4ed0-a996-b8f80233d10c-metrics-certs podName:f63a77a6-7e4a-4ed0-a996-b8f80233d10c nodeName:}" failed. No retries permitted until 2025-10-07 13:09:51.160144242 +0000 UTC m=+162.645853367 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f63a77a6-7e4a-4ed0-a996-b8f80233d10c-metrics-certs") pod "network-metrics-daemon-8bljr" (UID: "f63a77a6-7e4a-4ed0-a996-b8f80233d10c") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 07 13:08:47 crc kubenswrapper[4677]: I1007 13:08:47.241935 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:47 crc kubenswrapper[4677]: I1007 13:08:47.241992 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:47 crc kubenswrapper[4677]: I1007 13:08:47.242015 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:47 crc kubenswrapper[4677]: I1007 13:08:47.242042 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:47 crc kubenswrapper[4677]: I1007 13:08:47.242062 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:47Z","lastTransitionTime":"2025-10-07T13:08:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:47 crc kubenswrapper[4677]: I1007 13:08:47.302739 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8bljr" Oct 07 13:08:47 crc kubenswrapper[4677]: I1007 13:08:47.302814 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 13:08:47 crc kubenswrapper[4677]: I1007 13:08:47.302826 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:08:47 crc kubenswrapper[4677]: E1007 13:08:47.302961 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8bljr" podUID="f63a77a6-7e4a-4ed0-a996-b8f80233d10c" Oct 07 13:08:47 crc kubenswrapper[4677]: E1007 13:08:47.303088 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 13:08:47 crc kubenswrapper[4677]: E1007 13:08:47.303344 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
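Annotation (not part of the captured journal): the repeated NetworkReady=false entries above all trace back to one condition — nothing has written a CNI configuration into /etc/kubernetes/cni/net.d/, so the kubelet keeps the node NotReady and skips sandbox creation for network-metrics-daemon-8bljr, network-check-target-xd92c and networking-console-plugin-85b44fc459-gdk6g. The sketch below is a minimal, hypothetical host-side check for that condition; the .conf/.conflist/.json extension filter is an assumption about what the runtime conventionally loads, not something stated in the log.

```python
#!/usr/bin/env python3
# Illustrative check only (not part of the captured journal): list CNI
# configuration candidates in the directory the kubelet is complaining about.
# The extension filter (.conf, .conflist, .json) is an assumption about the
# files the container runtime conventionally loads.
from pathlib import Path

CNI_CONF_DIR = Path("/etc/kubernetes/cni/net.d")  # path taken from the log message

def cni_config_candidates(conf_dir: Path = CNI_CONF_DIR) -> list[Path]:
    """Return CNI config files the container runtime could pick up."""
    if not conf_dir.is_dir():
        return []
    return sorted(
        p for p in conf_dir.iterdir()
        if p.is_file() and p.suffix in {".conf", ".conflist", ".json"}
    )

if __name__ == "__main__":
    candidates = cni_config_candidates()
    if candidates:
        for path in candidates:
            print(f"found CNI config: {path}")
    else:
        print(f"no CNI configuration file in {CNI_CONF_DIR} - "
              "NetworkReady stays False until the network plugin writes one")
```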
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 13:08:47 crc kubenswrapper[4677]: I1007 13:08:47.311349 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:47 crc kubenswrapper[4677]: I1007 13:08:47.311403 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:47 crc kubenswrapper[4677]: I1007 13:08:47.311424 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:47 crc kubenswrapper[4677]: I1007 13:08:47.311488 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:47 crc kubenswrapper[4677]: I1007 13:08:47.311512 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:47Z","lastTransitionTime":"2025-10-07T13:08:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:47 crc kubenswrapper[4677]: E1007 13:08:47.333084 4677 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:08:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:08:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:08:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:08:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:08:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:08:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:08:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:08:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2461c0fe-8a8b-483d-90f2-2a3d8d7aca47\\\",\\\"systemUUID\\\":\\\"68c6c527-b248-4c1e-9fd2-b44685e78bcf\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:47Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:47 crc kubenswrapper[4677]: I1007 13:08:47.338700 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:47 crc kubenswrapper[4677]: I1007 13:08:47.338741 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
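Annotation (not part of the captured journal): every node-status patch in the entries above and below is rejected for the same reason — the node.network-node-identity.openshift.io webhook served at https://127.0.0.1:9743 presents a certificate whose notAfter (2025-08-24T17:21:41Z) is weeks before the node's current time (2025-10-07T13:08:47Z), so the TLS handshake fails before the patch is ever applied. The sketch below merely replays that validity-window comparison with the two timestamps copied from the error text; it is an illustration of the check that fails, not the verifier's actual code.

```python
#!/usr/bin/env python3
# Illustrative reconstruction (not part of the captured journal) of the
# x509 validity-window check that fails in the webhook call above: a
# certificate is rejected when "now" is later than its notAfter time.
# Both timestamps are copied verbatim from the log message.
from datetime import datetime, timezone

def parse_utc(ts: str) -> datetime:
    """Parse the UTC timestamps as they appear in the kubelet log."""
    return datetime.strptime(ts, "%Y-%m-%dT%H:%M:%SZ").replace(tzinfo=timezone.utc)

current_time = parse_utc("2025-10-07T13:08:47Z")  # "current time" from the error
not_after    = parse_utc("2025-08-24T17:21:41Z")  # certificate notAfter from the error

if current_time > not_after:
    expired_for = current_time - not_after
    print(f"certificate has expired: current time is {expired_for} past notAfter")
else:
    print("certificate is still within its validity window")
```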
event="NodeHasNoDiskPressure" Oct 07 13:08:47 crc kubenswrapper[4677]: I1007 13:08:47.338756 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:47 crc kubenswrapper[4677]: I1007 13:08:47.338776 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:47 crc kubenswrapper[4677]: I1007 13:08:47.338791 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:47Z","lastTransitionTime":"2025-10-07T13:08:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:47 crc kubenswrapper[4677]: E1007 13:08:47.358507 4677 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:08:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:08:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:08:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:08:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:08:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:08:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:08:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:08:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2461c0fe-8a8b-483d-90f2-2a3d8d7aca47\\\",\\\"systemUUID\\\":\\\"68c6c527-b248-4c1e-9fd2-b44685e78bcf\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:47Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:47 crc kubenswrapper[4677]: I1007 13:08:47.363297 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:47 crc kubenswrapper[4677]: I1007 13:08:47.363350 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 07 13:08:47 crc kubenswrapper[4677]: I1007 13:08:47.363372 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:47 crc kubenswrapper[4677]: I1007 13:08:47.363397 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:47 crc kubenswrapper[4677]: I1007 13:08:47.363420 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:47Z","lastTransitionTime":"2025-10-07T13:08:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:47 crc kubenswrapper[4677]: E1007 13:08:47.383819 4677 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:08:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:08:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:08:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:08:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:08:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:08:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:08:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:08:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2461c0fe-8a8b-483d-90f2-2a3d8d7aca47\\\",\\\"systemUUID\\\":\\\"68c6c527-b248-4c1e-9fd2-b44685e78bcf\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:47Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:47 crc kubenswrapper[4677]: I1007 13:08:47.388567 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:47 crc kubenswrapper[4677]: I1007 13:08:47.388631 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 07 13:08:47 crc kubenswrapper[4677]: I1007 13:08:47.388655 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:47 crc kubenswrapper[4677]: I1007 13:08:47.388681 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:47 crc kubenswrapper[4677]: I1007 13:08:47.388702 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:47Z","lastTransitionTime":"2025-10-07T13:08:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:47 crc kubenswrapper[4677]: E1007 13:08:47.408029 4677 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:08:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:08:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:08:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:08:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:08:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:08:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:08:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:08:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2461c0fe-8a8b-483d-90f2-2a3d8d7aca47\\\",\\\"systemUUID\\\":\\\"68c6c527-b248-4c1e-9fd2-b44685e78bcf\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:47Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:47 crc kubenswrapper[4677]: I1007 13:08:47.412469 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:47 crc kubenswrapper[4677]: I1007 13:08:47.412523 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 07 13:08:47 crc kubenswrapper[4677]: I1007 13:08:47.412577 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:47 crc kubenswrapper[4677]: I1007 13:08:47.412600 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:47 crc kubenswrapper[4677]: I1007 13:08:47.412617 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:47Z","lastTransitionTime":"2025-10-07T13:08:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:47 crc kubenswrapper[4677]: E1007 13:08:47.433673 4677 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:08:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:08:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:08:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:08:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:08:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:08:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:08:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:08:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2461c0fe-8a8b-483d-90f2-2a3d8d7aca47\\\",\\\"systemUUID\\\":\\\"68c6c527-b248-4c1e-9fd2-b44685e78bcf\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:47Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:47 crc kubenswrapper[4677]: E1007 13:08:47.433899 4677 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 07 13:08:47 crc kubenswrapper[4677]: I1007 13:08:47.436166 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Oct 07 13:08:47 crc kubenswrapper[4677]: I1007 13:08:47.436223 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:47 crc kubenswrapper[4677]: I1007 13:08:47.436242 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:47 crc kubenswrapper[4677]: I1007 13:08:47.436269 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:47 crc kubenswrapper[4677]: I1007 13:08:47.436286 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:47Z","lastTransitionTime":"2025-10-07T13:08:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:47 crc kubenswrapper[4677]: I1007 13:08:47.539300 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:47 crc kubenswrapper[4677]: I1007 13:08:47.539363 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:47 crc kubenswrapper[4677]: I1007 13:08:47.539380 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:47 crc kubenswrapper[4677]: I1007 13:08:47.539406 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:47 crc kubenswrapper[4677]: I1007 13:08:47.539423 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:47Z","lastTransitionTime":"2025-10-07T13:08:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:47 crc kubenswrapper[4677]: I1007 13:08:47.642896 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:47 crc kubenswrapper[4677]: I1007 13:08:47.642947 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:47 crc kubenswrapper[4677]: I1007 13:08:47.642964 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:47 crc kubenswrapper[4677]: I1007 13:08:47.642986 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:47 crc kubenswrapper[4677]: I1007 13:08:47.643004 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:47Z","lastTransitionTime":"2025-10-07T13:08:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:08:47 crc kubenswrapper[4677]: I1007 13:08:47.746052 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:47 crc kubenswrapper[4677]: I1007 13:08:47.746125 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:47 crc kubenswrapper[4677]: I1007 13:08:47.746150 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:47 crc kubenswrapper[4677]: I1007 13:08:47.746180 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:47 crc kubenswrapper[4677]: I1007 13:08:47.746203 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:47Z","lastTransitionTime":"2025-10-07T13:08:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:47 crc kubenswrapper[4677]: I1007 13:08:47.849201 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:47 crc kubenswrapper[4677]: I1007 13:08:47.849257 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:47 crc kubenswrapper[4677]: I1007 13:08:47.849276 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:47 crc kubenswrapper[4677]: I1007 13:08:47.849301 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:47 crc kubenswrapper[4677]: I1007 13:08:47.849320 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:47Z","lastTransitionTime":"2025-10-07T13:08:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:47 crc kubenswrapper[4677]: I1007 13:08:47.952360 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:47 crc kubenswrapper[4677]: I1007 13:08:47.952393 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:47 crc kubenswrapper[4677]: I1007 13:08:47.952424 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:47 crc kubenswrapper[4677]: I1007 13:08:47.952473 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:47 crc kubenswrapper[4677]: I1007 13:08:47.952485 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:47Z","lastTransitionTime":"2025-10-07T13:08:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:08:48 crc kubenswrapper[4677]: I1007 13:08:48.055988 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:48 crc kubenswrapper[4677]: I1007 13:08:48.056052 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:48 crc kubenswrapper[4677]: I1007 13:08:48.056073 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:48 crc kubenswrapper[4677]: I1007 13:08:48.056099 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:48 crc kubenswrapper[4677]: I1007 13:08:48.056117 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:48Z","lastTransitionTime":"2025-10-07T13:08:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:48 crc kubenswrapper[4677]: I1007 13:08:48.158915 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:48 crc kubenswrapper[4677]: I1007 13:08:48.158996 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:48 crc kubenswrapper[4677]: I1007 13:08:48.159020 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:48 crc kubenswrapper[4677]: I1007 13:08:48.159057 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:48 crc kubenswrapper[4677]: I1007 13:08:48.159079 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:48Z","lastTransitionTime":"2025-10-07T13:08:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:48 crc kubenswrapper[4677]: I1007 13:08:48.261506 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:48 crc kubenswrapper[4677]: I1007 13:08:48.261551 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:48 crc kubenswrapper[4677]: I1007 13:08:48.261563 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:48 crc kubenswrapper[4677]: I1007 13:08:48.261580 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:48 crc kubenswrapper[4677]: I1007 13:08:48.261591 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:48Z","lastTransitionTime":"2025-10-07T13:08:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:08:48 crc kubenswrapper[4677]: I1007 13:08:48.302574 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 13:08:48 crc kubenswrapper[4677]: E1007 13:08:48.302932 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 13:08:48 crc kubenswrapper[4677]: I1007 13:08:48.365821 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:48 crc kubenswrapper[4677]: I1007 13:08:48.365889 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:48 crc kubenswrapper[4677]: I1007 13:08:48.365909 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:48 crc kubenswrapper[4677]: I1007 13:08:48.365941 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:48 crc kubenswrapper[4677]: I1007 13:08:48.365965 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:48Z","lastTransitionTime":"2025-10-07T13:08:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:48 crc kubenswrapper[4677]: I1007 13:08:48.469475 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:48 crc kubenswrapper[4677]: I1007 13:08:48.469547 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:48 crc kubenswrapper[4677]: I1007 13:08:48.469574 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:48 crc kubenswrapper[4677]: I1007 13:08:48.469604 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:48 crc kubenswrapper[4677]: I1007 13:08:48.469625 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:48Z","lastTransitionTime":"2025-10-07T13:08:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:08:48 crc kubenswrapper[4677]: I1007 13:08:48.573140 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:48 crc kubenswrapper[4677]: I1007 13:08:48.573218 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:48 crc kubenswrapper[4677]: I1007 13:08:48.573240 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:48 crc kubenswrapper[4677]: I1007 13:08:48.573269 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:48 crc kubenswrapper[4677]: I1007 13:08:48.573294 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:48Z","lastTransitionTime":"2025-10-07T13:08:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:48 crc kubenswrapper[4677]: I1007 13:08:48.677215 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:48 crc kubenswrapper[4677]: I1007 13:08:48.677289 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:48 crc kubenswrapper[4677]: I1007 13:08:48.677312 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:48 crc kubenswrapper[4677]: I1007 13:08:48.677342 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:48 crc kubenswrapper[4677]: I1007 13:08:48.677366 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:48Z","lastTransitionTime":"2025-10-07T13:08:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:48 crc kubenswrapper[4677]: I1007 13:08:48.781727 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:48 crc kubenswrapper[4677]: I1007 13:08:48.781802 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:48 crc kubenswrapper[4677]: I1007 13:08:48.781827 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:48 crc kubenswrapper[4677]: I1007 13:08:48.781859 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:48 crc kubenswrapper[4677]: I1007 13:08:48.781885 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:48Z","lastTransitionTime":"2025-10-07T13:08:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:08:48 crc kubenswrapper[4677]: I1007 13:08:48.885192 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:48 crc kubenswrapper[4677]: I1007 13:08:48.885262 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:48 crc kubenswrapper[4677]: I1007 13:08:48.885284 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:48 crc kubenswrapper[4677]: I1007 13:08:48.885311 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:48 crc kubenswrapper[4677]: I1007 13:08:48.885328 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:48Z","lastTransitionTime":"2025-10-07T13:08:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:48 crc kubenswrapper[4677]: I1007 13:08:48.988527 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:48 crc kubenswrapper[4677]: I1007 13:08:48.988589 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:48 crc kubenswrapper[4677]: I1007 13:08:48.988609 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:48 crc kubenswrapper[4677]: I1007 13:08:48.988636 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:48 crc kubenswrapper[4677]: I1007 13:08:48.988654 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:48Z","lastTransitionTime":"2025-10-07T13:08:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:49 crc kubenswrapper[4677]: I1007 13:08:49.092408 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:49 crc kubenswrapper[4677]: I1007 13:08:49.092554 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:49 crc kubenswrapper[4677]: I1007 13:08:49.092577 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:49 crc kubenswrapper[4677]: I1007 13:08:49.092603 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:49 crc kubenswrapper[4677]: I1007 13:08:49.092620 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:49Z","lastTransitionTime":"2025-10-07T13:08:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:08:49 crc kubenswrapper[4677]: I1007 13:08:49.194992 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:49 crc kubenswrapper[4677]: I1007 13:08:49.195083 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:49 crc kubenswrapper[4677]: I1007 13:08:49.195103 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:49 crc kubenswrapper[4677]: I1007 13:08:49.195123 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:49 crc kubenswrapper[4677]: I1007 13:08:49.195139 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:49Z","lastTransitionTime":"2025-10-07T13:08:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:49 crc kubenswrapper[4677]: I1007 13:08:49.297902 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:49 crc kubenswrapper[4677]: I1007 13:08:49.297970 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:49 crc kubenswrapper[4677]: I1007 13:08:49.297984 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:49 crc kubenswrapper[4677]: I1007 13:08:49.298030 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:49 crc kubenswrapper[4677]: I1007 13:08:49.298048 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:49Z","lastTransitionTime":"2025-10-07T13:08:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:49 crc kubenswrapper[4677]: I1007 13:08:49.302559 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8bljr" Oct 07 13:08:49 crc kubenswrapper[4677]: I1007 13:08:49.302630 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 13:08:49 crc kubenswrapper[4677]: E1007 13:08:49.302714 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8bljr" podUID="f63a77a6-7e4a-4ed0-a996-b8f80233d10c" Oct 07 13:08:49 crc kubenswrapper[4677]: I1007 13:08:49.302831 4677 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:08:49 crc kubenswrapper[4677]: E1007 13:08:49.302943 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 13:08:49 crc kubenswrapper[4677]: E1007 13:08:49.303138 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 13:08:49 crc kubenswrapper[4677]: I1007 13:08:49.321939 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:49Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:49 crc kubenswrapper[4677]: I1007 13:08:49.340539 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e49f4b2f98de9e297e6a31a5583120192adf9a013700b49bb419e54d9e75fdbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:49Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:49 crc kubenswrapper[4677]: I1007 13:08:49.356254 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8bljr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f63a77a6-7e4a-4ed0-a996-b8f80233d10c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rc97j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rc97j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:43Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8bljr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:49Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:49 crc kubenswrapper[4677]: I1007 13:08:49.371778 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c40c47d2-50a4-43f1-9b6e-08b60a3260c5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f36e52a7e88b59d8fd38c1fe659ce9b539e514c9d31e326a3ed647ebb8d19781\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://84b1c015461fecca9e5122abe950f33e24f4b7188568ea84cb059a08a4637963\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ecf81a2a9f147c0d9643f8e6c45248164053203ca4e5bbdc57c38e5803a5386\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e8dc3f8bdc52104efdb49a017d6497e2aaa3ed2b593794413fcd1acf2e06d36\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:49Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:49 crc kubenswrapper[4677]: I1007 13:08:49.390458 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:49Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:49 crc kubenswrapper[4677]: I1007 13:08:49.400117 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:49 crc kubenswrapper[4677]: I1007 13:08:49.400178 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:49 crc kubenswrapper[4677]: I1007 13:08:49.400197 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:49 crc kubenswrapper[4677]: I1007 13:08:49.400217 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:49 crc kubenswrapper[4677]: I1007 13:08:49.400262 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:49Z","lastTransitionTime":"2025-10-07T13:08:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:08:49 crc kubenswrapper[4677]: I1007 13:08:49.406767 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-r7cnz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7879fa59-a7cb-4d29-ba3a-c91f43bfcba6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5b2cfaaf4533573a7cdf927cb9a0b61690f4f04ca22f5da5013fd218ee2cba1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r59hd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d3e4ef8267212ad1faf24bfcb3b6f633a283684ba587e304e94d434bd9a2618\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r59hd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-r7cnz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:49Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:49 crc kubenswrapper[4677]: I1007 13:08:49.435964 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29c8j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3458826a-000d-407d-92c8-236d1a05842e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cf7d8cdd34bc883eae38c5e4690efd4e1c29cc633b5bbadc5de2b5b844a9da3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b77b2aafb3baf1c5b72d62156bd1c1bec76385637d5795166fe3d4f22a169503\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kuber
netes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99b5fbb5ad3249aa5264c37bd635ed5f6283ec72c7eb071002cd7bddc12052f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eee7c253a1a514447553be977a3e534608ef6a1178664bf139ee84ec41180db0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f333db7aeb7d3cd308131992b4cd1284c1c56e27bbfd731404febc0efc953925\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ddf4e352b778815786f6fb204486a53d958310e53569f89a2895fe388a727da\\\",\\\"image
\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0636ac68feadc31552df6dee8669b4b1d477332b3405bff9bd63eaaa3362a6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0636ac68feadc31552df6dee8669b4b1d477332b3405bff9bd63eaaa3362a6f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T13:08:34Z\\\",\\\"message\\\":\\\"_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.244:9393:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d8772e82-b0a4-4596-87d3-3d517c13344b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1007 13:08:34.206383 6814 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-cluster-version/cluster-version-operator]} name:Service_openshift-cluster-version/cluster-version-operator_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.182:9099:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {61d39e4d-21a9-4387-9a2b-fa4ad14792e2}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1007 13:08:34.206522 6814 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to 
sh\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:08:33Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-29c8j_openshift-ovn-kubernetes(3458826a-000d-407d-92c8-236d1a05842e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1f410624ff7e026c196d43d5ef830ce7b34981b703d5399a135dab0122640ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveR
eadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa229d0805576aa04db8e58e52e8c8bdc07663414dde42a94cac4fe628cfac7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa229d0805576aa04db8e58e52e8c8bdc07663414dde42a94cac4fe628cfac7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm7l2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-29c8j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:49Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:49 crc kubenswrapper[4677]: I1007 13:08:49.454577 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pjgpx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"73bebfb3-50b5-48b6-b348-1d1feb6202d2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fcea2caf828321399fab99a6225cb39dd0c4aba8481cc040a10d86e90b6e4029\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b0ac92c71edc3d5107aece2d0e005a546cf25d79d696f4e330b7c0c8babc546\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T13:08:16Z\\\",\\\"message\\\":\\\"2025-10-07T13:07:31+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_d2e9730f-3112-4cf9-bde0-fbf3e73808c6\\\\n2025-10-07T13:07:31+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_d2e9730f-3112-4cf9-bde0-fbf3e73808c6 to /host/opt/cni/bin/\\\\n2025-10-07T13:07:31Z [verbose] multus-daemon started\\\\n2025-10-07T13:07:31Z [verbose] Readiness Indicator file check\\\\n2025-10-07T13:08:16Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:08:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h59cg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pjgpx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:49Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:49 crc kubenswrapper[4677]: I1007 13:08:49.472605 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qf2v9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea5a5436-29f6-4edd-9d4d-22eb9dd828c3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1a8e8a31adbc84ed02ff984941bb00da95740b19e8717fc6d4fb39b62338973\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ftmxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://97188126bcea5ad3844f74c9402e831926e1142944778240b4d4b26da7ea40c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ftmxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qf2v9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:49Z is after 2025-08-24T17:21:41Z" Oct 07 
13:08:49 crc kubenswrapper[4677]: I1007 13:08:49.489948 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c3400d7-6126-498b-ba93-b88903b8d698\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:08:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:08:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffff3685813b0115a56c61e90cb80318d0265429d9be16eeb9a4d0870ec2a442\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://10f7bfbc2ad9b8a554ee30118d74323aeddc334939ab97ab61cbd5eb24ae1db3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c50c0057805c0f200830303329e9c3c8c75c20246ace7131caff6afb6aca6f79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.
126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6cd7e95c3dad799fcc99041e53970f7f6be7e9f4280d724394e9a06051043706\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6cd7e95c3dad799fcc99041e53970f7f6be7e9f4280d724394e9a06051043706\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:10Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:09Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:49Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:49 crc kubenswrapper[4677]: I1007 13:08:49.503018 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:49 crc kubenswrapper[4677]: I1007 13:08:49.503167 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:49 crc kubenswrapper[4677]: I1007 13:08:49.503193 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:49 crc kubenswrapper[4677]: I1007 13:08:49.503224 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:49 crc kubenswrapper[4677]: I1007 13:08:49.503249 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:49Z","lastTransitionTime":"2025-10-07T13:08:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:08:49 crc kubenswrapper[4677]: I1007 13:08:49.509536 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf9347ca53bc58ad2e19bdbccd5eb40fde5ef36cdc0c2a2899e7e86977208446\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7932ee6d24ab75f34dabb17b5c2732dc1437e94b4fab6cace5c5bf4d8b4a8fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:49Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:49 crc kubenswrapper[4677]: I1007 13:08:49.526025 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c2h2k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6a7b491-6ed9-4906-8d2d-d8913a581b95\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://079f44a0676fd6e659268707658dfce76f5c80881ebd1b7f77b831a653002cbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gh4gv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c2h2k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:49Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:49 crc kubenswrapper[4677]: I1007 13:08:49.546128 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-czmsr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"67f7f734-b59a-447c-b4a5-5aeb78d3a4dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://620c115cda692d86e1c655fe633ade8d56b4ad3faff70ec3383e0d6931e91acd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84e2aad54ae491e74845fbb681491bde96694b5f242d15a1b4c0f62b81048e7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84e2aad54ae491e74845fbb681491bde96694b5f242d15a1b4c0f62b81048e7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2dfd4b5d1804d9d8e9ccf4dea8fb23cf60cd039e32d4bfc7d0b0e189da178b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2dfd4b5d1804d9d8e9ccf4dea8fb23cf60cd039e32d4bfc7d0b0e189da178b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c337ed2525a5d0aca41ff193138829ecd98172765110f97081d5a9b57ea39519\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c337ed2525a5d0aca41ff193138829ecd98172765110f97081d5a9b57ea39519\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b53d862bdcbdf9ebb58ca6717073ecf18236be981bf633bf22e3a077fa6ed90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b53d862bdcbdf9ebb58ca6717073ecf18236be981bf633bf22e3a077fa6ed90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3de85e8780fadb509355343e297ea73c2bd1047219be9911d445e044d40d378b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3de85e8780fadb509355343e297ea73c2bd1047219be9911d445e044d40d378b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6309cf358d3d85018e9e89fdb974146ab00944de09d3b38cfbf357df59b442f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6309cf358d3d85018e9e89fdb974146ab00944de09d3b38cfbf357df59b442f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rstqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-czmsr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:49Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:49 crc kubenswrapper[4677]: I1007 13:08:49.576008 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9c35782-52f8-4fbc-9e52-07ee92002e3d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9182677b05c8d32f333b4e806b6dc29e0ce3f6171616ed303459ccb6a3754a4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://550a4491cebcbd8b3a62831cce07b13bb79051cd51505aab1f74bcfee692f7b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d90d7cc786a9f94c269a99be97c00685a2e10bde12e0afe4db2de40b95749a47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e99541a9f53339e760fb1074be18ebfcb8b225c
64c290478559d2e3722ba9296\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73cec6b690e4d01d9d206a812f278832d622a7bdfb74ddcfb5904e19f721fae6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f16639187300b01b7c9f30dda26da7c1de9de9c66baf7ad716875eba41a5d7d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f16639187300b01b7c9f30dda26da7c1de9de9c66baf7ad716875eba41a5d7d6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dab4e59081dcacddedf345841dce8c940fae52aeee55d6c956e1364dd70872b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dab4e59081dcacddedf345841dce8c940fae52aeee55d6c956e1364dd70872b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://abcfe5aa83c73ae24b7f0b414b47344969924ca50a5e1669bfb4704f1386cd17\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abcfe5aa83c73ae24b7f0b414b47344969924ca50a5e1669bfb4704f1386cd17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:49Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:49 crc kubenswrapper[4677]: I1007 13:08:49.595805 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ad65790-6a90-4c21-b5c5-ac1ddf2cbe52\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34b15b8f71a10920a74f784c3440031e14726f661e10f628b269da08e70a7cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a51c6d4e82136b444754dc679f86455
8f74624af4ff94f794e473c92c8f6c87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96d7ddc445a61d5fd4959d1b3b4e2c93503111a12f461d945dd298a3f8540f65\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13a4e9dcaf4f4585c45625444ba093f84acc83f03e96235efd054bed4a38fc21\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f3139524d94a487c756b4a7e3660c0a4380bcba8aa8b588368530f43aab0b1d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T13:07:25Z\\\",\\\"message\\\":\\\"le observer\\\\nW1007 13:07:25.065953 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1007 13:07:25.066115 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 13:07:25.067558 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2107846005/tls.crt::/tmp/serving-cert-2107846005/tls.key\\\\\\\"\\\\nI1007 13:07:25.378394 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1007 13:07:25.383525 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1007 13:07:25.383565 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1007 13:07:25.383605 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1007 13:07:25.383617 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1007 13:07:25.390977 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1007 13:07:25.391014 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1007 13:07:25.391020 1 
secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 13:07:25.391038 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 13:07:25.391053 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1007 13:07:25.391061 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1007 13:07:25.391071 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1007 13:07:25.391088 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1007 13:07:25.394664 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b92962cb41f37a615f473651c01e37f5d53e01f3fb4b7c0eb2092095bb55239\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d927fbc5242eb951e8c2ec6d2859315f34e4b190bb43e646044111d1f583bf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d927fbc5242eb951e8c2ec6d2859315f34e4b190bb43e646044111d1f583bf0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:49Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:49 crc kubenswrapper[4677]: I1007 13:08:49.606024 4677 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:49 crc kubenswrapper[4677]: I1007 13:08:49.606070 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:49 crc kubenswrapper[4677]: I1007 13:08:49.606089 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:49 crc kubenswrapper[4677]: I1007 13:08:49.606112 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:49 crc kubenswrapper[4677]: I1007 13:08:49.606129 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:49Z","lastTransitionTime":"2025-10-07T13:08:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:49 crc kubenswrapper[4677]: I1007 13:08:49.609842 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfed5e4c-bba3-4aab-86f7-27b722b12d83\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://373e947054a32f5e1ecb5b66d2a5e668a14a1c76b2329cc4a60ddee65c80a3e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a70bca62773a15d295207b342b32a4263173ee7ebee7222bb16204210e168a52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a70bca62773a15d295207b342b32a4263173ee7ebee7222bb16204210e168a52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:07:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:09Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:49Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:49 crc kubenswrapper[4677]: I1007 13:08:49.625120 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ef29067dc23263a94c4f861ada9ebbe04aae442de3da9fa34db521177f60ce8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:49Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:49 crc kubenswrapper[4677]: I1007 13:08:49.639268 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:49Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:49 crc kubenswrapper[4677]: I1007 13:08:49.655345 4677 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8xd94" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f78c6e9a-e5e3-4296-b2e6-3ba36d1808ff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2e7ebbc9f01ac7f853075c65c8cc57c691cf3f95e41036294486ad4a3bb807c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:07:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8xd94\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:08:49Z is after 2025-08-24T17:21:41Z" Oct 07 13:08:49 crc kubenswrapper[4677]: I1007 13:08:49.708729 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:49 crc kubenswrapper[4677]: I1007 13:08:49.708784 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:49 crc kubenswrapper[4677]: I1007 13:08:49.708827 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:49 crc kubenswrapper[4677]: I1007 13:08:49.708889 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:49 crc kubenswrapper[4677]: I1007 13:08:49.708910 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:49Z","lastTransitionTime":"2025-10-07T13:08:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:49 crc kubenswrapper[4677]: I1007 13:08:49.812408 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:49 crc kubenswrapper[4677]: I1007 13:08:49.812511 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:49 crc kubenswrapper[4677]: I1007 13:08:49.812529 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:49 crc kubenswrapper[4677]: I1007 13:08:49.812553 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:49 crc kubenswrapper[4677]: I1007 13:08:49.812572 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:49Z","lastTransitionTime":"2025-10-07T13:08:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:49 crc kubenswrapper[4677]: I1007 13:08:49.916085 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:49 crc kubenswrapper[4677]: I1007 13:08:49.916214 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:49 crc kubenswrapper[4677]: I1007 13:08:49.916235 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:49 crc kubenswrapper[4677]: I1007 13:08:49.916294 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:49 crc kubenswrapper[4677]: I1007 13:08:49.916322 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:49Z","lastTransitionTime":"2025-10-07T13:08:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:08:50 crc kubenswrapper[4677]: I1007 13:08:50.020196 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:50 crc kubenswrapper[4677]: I1007 13:08:50.020295 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:50 crc kubenswrapper[4677]: I1007 13:08:50.020320 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:50 crc kubenswrapper[4677]: I1007 13:08:50.020388 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:50 crc kubenswrapper[4677]: I1007 13:08:50.020415 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:50Z","lastTransitionTime":"2025-10-07T13:08:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:50 crc kubenswrapper[4677]: I1007 13:08:50.122989 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:50 crc kubenswrapper[4677]: I1007 13:08:50.123057 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:50 crc kubenswrapper[4677]: I1007 13:08:50.123075 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:50 crc kubenswrapper[4677]: I1007 13:08:50.123096 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:50 crc kubenswrapper[4677]: I1007 13:08:50.123114 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:50Z","lastTransitionTime":"2025-10-07T13:08:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:50 crc kubenswrapper[4677]: I1007 13:08:50.226818 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:50 crc kubenswrapper[4677]: I1007 13:08:50.226883 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:50 crc kubenswrapper[4677]: I1007 13:08:50.226924 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:50 crc kubenswrapper[4677]: I1007 13:08:50.226960 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:50 crc kubenswrapper[4677]: I1007 13:08:50.227218 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:50Z","lastTransitionTime":"2025-10-07T13:08:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:08:50 crc kubenswrapper[4677]: I1007 13:08:50.302993 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 13:08:50 crc kubenswrapper[4677]: E1007 13:08:50.303200 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 13:08:50 crc kubenswrapper[4677]: I1007 13:08:50.304180 4677 scope.go:117] "RemoveContainer" containerID="b0636ac68feadc31552df6dee8669b4b1d477332b3405bff9bd63eaaa3362a6f" Oct 07 13:08:50 crc kubenswrapper[4677]: E1007 13:08:50.304467 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-29c8j_openshift-ovn-kubernetes(3458826a-000d-407d-92c8-236d1a05842e)\"" pod="openshift-ovn-kubernetes/ovnkube-node-29c8j" podUID="3458826a-000d-407d-92c8-236d1a05842e" Oct 07 13:08:50 crc kubenswrapper[4677]: I1007 13:08:50.330317 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:50 crc kubenswrapper[4677]: I1007 13:08:50.330369 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:50 crc kubenswrapper[4677]: I1007 13:08:50.330380 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:50 crc kubenswrapper[4677]: I1007 13:08:50.330398 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:50 crc kubenswrapper[4677]: I1007 13:08:50.330411 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:50Z","lastTransitionTime":"2025-10-07T13:08:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:08:50 crc kubenswrapper[4677]: I1007 13:08:50.433691 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:50 crc kubenswrapper[4677]: I1007 13:08:50.433737 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:50 crc kubenswrapper[4677]: I1007 13:08:50.433746 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:50 crc kubenswrapper[4677]: I1007 13:08:50.433763 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:50 crc kubenswrapper[4677]: I1007 13:08:50.433773 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:50Z","lastTransitionTime":"2025-10-07T13:08:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:50 crc kubenswrapper[4677]: I1007 13:08:50.536140 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:50 crc kubenswrapper[4677]: I1007 13:08:50.536232 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:50 crc kubenswrapper[4677]: I1007 13:08:50.536243 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:50 crc kubenswrapper[4677]: I1007 13:08:50.536286 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:50 crc kubenswrapper[4677]: I1007 13:08:50.536298 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:50Z","lastTransitionTime":"2025-10-07T13:08:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:50 crc kubenswrapper[4677]: I1007 13:08:50.639948 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:50 crc kubenswrapper[4677]: I1007 13:08:50.640031 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:50 crc kubenswrapper[4677]: I1007 13:08:50.640070 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:50 crc kubenswrapper[4677]: I1007 13:08:50.640102 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:50 crc kubenswrapper[4677]: I1007 13:08:50.640127 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:50Z","lastTransitionTime":"2025-10-07T13:08:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:08:50 crc kubenswrapper[4677]: I1007 13:08:50.743069 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:50 crc kubenswrapper[4677]: I1007 13:08:50.743136 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:50 crc kubenswrapper[4677]: I1007 13:08:50.743160 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:50 crc kubenswrapper[4677]: I1007 13:08:50.743191 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:50 crc kubenswrapper[4677]: I1007 13:08:50.743206 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:50Z","lastTransitionTime":"2025-10-07T13:08:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:50 crc kubenswrapper[4677]: I1007 13:08:50.846741 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:50 crc kubenswrapper[4677]: I1007 13:08:50.846918 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:50 crc kubenswrapper[4677]: I1007 13:08:50.846944 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:50 crc kubenswrapper[4677]: I1007 13:08:50.846978 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:50 crc kubenswrapper[4677]: I1007 13:08:50.847002 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:50Z","lastTransitionTime":"2025-10-07T13:08:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:50 crc kubenswrapper[4677]: I1007 13:08:50.950413 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:50 crc kubenswrapper[4677]: I1007 13:08:50.950507 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:50 crc kubenswrapper[4677]: I1007 13:08:50.950524 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:50 crc kubenswrapper[4677]: I1007 13:08:50.950548 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:50 crc kubenswrapper[4677]: I1007 13:08:50.950566 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:50Z","lastTransitionTime":"2025-10-07T13:08:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:08:51 crc kubenswrapper[4677]: I1007 13:08:51.052862 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:51 crc kubenswrapper[4677]: I1007 13:08:51.052915 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:51 crc kubenswrapper[4677]: I1007 13:08:51.052926 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:51 crc kubenswrapper[4677]: I1007 13:08:51.052944 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:51 crc kubenswrapper[4677]: I1007 13:08:51.052956 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:51Z","lastTransitionTime":"2025-10-07T13:08:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:51 crc kubenswrapper[4677]: I1007 13:08:51.155339 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:51 crc kubenswrapper[4677]: I1007 13:08:51.155384 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:51 crc kubenswrapper[4677]: I1007 13:08:51.155394 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:51 crc kubenswrapper[4677]: I1007 13:08:51.155407 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:51 crc kubenswrapper[4677]: I1007 13:08:51.155417 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:51Z","lastTransitionTime":"2025-10-07T13:08:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:51 crc kubenswrapper[4677]: I1007 13:08:51.258350 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:51 crc kubenswrapper[4677]: I1007 13:08:51.258425 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:51 crc kubenswrapper[4677]: I1007 13:08:51.258462 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:51 crc kubenswrapper[4677]: I1007 13:08:51.258479 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:51 crc kubenswrapper[4677]: I1007 13:08:51.258491 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:51Z","lastTransitionTime":"2025-10-07T13:08:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:08:51 crc kubenswrapper[4677]: I1007 13:08:51.302320 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:08:51 crc kubenswrapper[4677]: I1007 13:08:51.302363 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 13:08:51 crc kubenswrapper[4677]: I1007 13:08:51.302654 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8bljr" Oct 07 13:08:51 crc kubenswrapper[4677]: E1007 13:08:51.302822 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 13:08:51 crc kubenswrapper[4677]: E1007 13:08:51.302966 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8bljr" podUID="f63a77a6-7e4a-4ed0-a996-b8f80233d10c" Oct 07 13:08:51 crc kubenswrapper[4677]: E1007 13:08:51.303057 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 13:08:51 crc kubenswrapper[4677]: I1007 13:08:51.360537 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:51 crc kubenswrapper[4677]: I1007 13:08:51.360581 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:51 crc kubenswrapper[4677]: I1007 13:08:51.360595 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:51 crc kubenswrapper[4677]: I1007 13:08:51.360619 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:51 crc kubenswrapper[4677]: I1007 13:08:51.360641 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:51Z","lastTransitionTime":"2025-10-07T13:08:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:08:51 crc kubenswrapper[4677]: I1007 13:08:51.464715 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:51 crc kubenswrapper[4677]: I1007 13:08:51.464785 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:51 crc kubenswrapper[4677]: I1007 13:08:51.464802 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:51 crc kubenswrapper[4677]: I1007 13:08:51.464831 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:51 crc kubenswrapper[4677]: I1007 13:08:51.464848 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:51Z","lastTransitionTime":"2025-10-07T13:08:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:51 crc kubenswrapper[4677]: I1007 13:08:51.568175 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:51 crc kubenswrapper[4677]: I1007 13:08:51.568238 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:51 crc kubenswrapper[4677]: I1007 13:08:51.568250 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:51 crc kubenswrapper[4677]: I1007 13:08:51.568273 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:51 crc kubenswrapper[4677]: I1007 13:08:51.568286 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:51Z","lastTransitionTime":"2025-10-07T13:08:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:51 crc kubenswrapper[4677]: I1007 13:08:51.671541 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:51 crc kubenswrapper[4677]: I1007 13:08:51.671582 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:51 crc kubenswrapper[4677]: I1007 13:08:51.671591 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:51 crc kubenswrapper[4677]: I1007 13:08:51.671607 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:51 crc kubenswrapper[4677]: I1007 13:08:51.671617 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:51Z","lastTransitionTime":"2025-10-07T13:08:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:08:51 crc kubenswrapper[4677]: I1007 13:08:51.774527 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:51 crc kubenswrapper[4677]: I1007 13:08:51.774598 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:51 crc kubenswrapper[4677]: I1007 13:08:51.774617 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:51 crc kubenswrapper[4677]: I1007 13:08:51.774642 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:51 crc kubenswrapper[4677]: I1007 13:08:51.774662 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:51Z","lastTransitionTime":"2025-10-07T13:08:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:51 crc kubenswrapper[4677]: I1007 13:08:51.877715 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:51 crc kubenswrapper[4677]: I1007 13:08:51.877765 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:51 crc kubenswrapper[4677]: I1007 13:08:51.877775 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:51 crc kubenswrapper[4677]: I1007 13:08:51.877791 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:51 crc kubenswrapper[4677]: I1007 13:08:51.877802 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:51Z","lastTransitionTime":"2025-10-07T13:08:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:51 crc kubenswrapper[4677]: I1007 13:08:51.981219 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:51 crc kubenswrapper[4677]: I1007 13:08:51.981261 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:51 crc kubenswrapper[4677]: I1007 13:08:51.981272 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:51 crc kubenswrapper[4677]: I1007 13:08:51.981292 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:51 crc kubenswrapper[4677]: I1007 13:08:51.981347 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:51Z","lastTransitionTime":"2025-10-07T13:08:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:08:52 crc kubenswrapper[4677]: I1007 13:08:52.085006 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:52 crc kubenswrapper[4677]: I1007 13:08:52.085071 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:52 crc kubenswrapper[4677]: I1007 13:08:52.085093 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:52 crc kubenswrapper[4677]: I1007 13:08:52.085118 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:52 crc kubenswrapper[4677]: I1007 13:08:52.085137 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:52Z","lastTransitionTime":"2025-10-07T13:08:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:52 crc kubenswrapper[4677]: I1007 13:08:52.188078 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:52 crc kubenswrapper[4677]: I1007 13:08:52.188171 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:52 crc kubenswrapper[4677]: I1007 13:08:52.188195 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:52 crc kubenswrapper[4677]: I1007 13:08:52.188225 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:52 crc kubenswrapper[4677]: I1007 13:08:52.188247 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:52Z","lastTransitionTime":"2025-10-07T13:08:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:52 crc kubenswrapper[4677]: I1007 13:08:52.291227 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:52 crc kubenswrapper[4677]: I1007 13:08:52.291290 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:52 crc kubenswrapper[4677]: I1007 13:08:52.291306 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:52 crc kubenswrapper[4677]: I1007 13:08:52.291329 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:52 crc kubenswrapper[4677]: I1007 13:08:52.291346 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:52Z","lastTransitionTime":"2025-10-07T13:08:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:08:52 crc kubenswrapper[4677]: I1007 13:08:52.303102 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 13:08:52 crc kubenswrapper[4677]: E1007 13:08:52.303291 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 13:08:52 crc kubenswrapper[4677]: I1007 13:08:52.394769 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:52 crc kubenswrapper[4677]: I1007 13:08:52.394852 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:52 crc kubenswrapper[4677]: I1007 13:08:52.394926 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:52 crc kubenswrapper[4677]: I1007 13:08:52.394959 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:52 crc kubenswrapper[4677]: I1007 13:08:52.394981 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:52Z","lastTransitionTime":"2025-10-07T13:08:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:52 crc kubenswrapper[4677]: I1007 13:08:52.498104 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:52 crc kubenswrapper[4677]: I1007 13:08:52.498168 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:52 crc kubenswrapper[4677]: I1007 13:08:52.498190 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:52 crc kubenswrapper[4677]: I1007 13:08:52.498220 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:52 crc kubenswrapper[4677]: I1007 13:08:52.498240 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:52Z","lastTransitionTime":"2025-10-07T13:08:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:08:52 crc kubenswrapper[4677]: I1007 13:08:52.600824 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:52 crc kubenswrapper[4677]: I1007 13:08:52.600891 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:52 crc kubenswrapper[4677]: I1007 13:08:52.600908 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:52 crc kubenswrapper[4677]: I1007 13:08:52.600933 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:52 crc kubenswrapper[4677]: I1007 13:08:52.600951 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:52Z","lastTransitionTime":"2025-10-07T13:08:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:52 crc kubenswrapper[4677]: I1007 13:08:52.704591 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:52 crc kubenswrapper[4677]: I1007 13:08:52.704656 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:52 crc kubenswrapper[4677]: I1007 13:08:52.704676 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:52 crc kubenswrapper[4677]: I1007 13:08:52.704701 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:52 crc kubenswrapper[4677]: I1007 13:08:52.704719 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:52Z","lastTransitionTime":"2025-10-07T13:08:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:52 crc kubenswrapper[4677]: I1007 13:08:52.807680 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:52 crc kubenswrapper[4677]: I1007 13:08:52.807768 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:52 crc kubenswrapper[4677]: I1007 13:08:52.807781 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:52 crc kubenswrapper[4677]: I1007 13:08:52.807804 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:52 crc kubenswrapper[4677]: I1007 13:08:52.807819 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:52Z","lastTransitionTime":"2025-10-07T13:08:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:08:52 crc kubenswrapper[4677]: I1007 13:08:52.910590 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:52 crc kubenswrapper[4677]: I1007 13:08:52.910641 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:52 crc kubenswrapper[4677]: I1007 13:08:52.910652 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:52 crc kubenswrapper[4677]: I1007 13:08:52.910668 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:52 crc kubenswrapper[4677]: I1007 13:08:52.910679 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:52Z","lastTransitionTime":"2025-10-07T13:08:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:53 crc kubenswrapper[4677]: I1007 13:08:53.013588 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:53 crc kubenswrapper[4677]: I1007 13:08:53.013645 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:53 crc kubenswrapper[4677]: I1007 13:08:53.013661 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:53 crc kubenswrapper[4677]: I1007 13:08:53.013684 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:53 crc kubenswrapper[4677]: I1007 13:08:53.013704 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:53Z","lastTransitionTime":"2025-10-07T13:08:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:53 crc kubenswrapper[4677]: I1007 13:08:53.117834 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:53 crc kubenswrapper[4677]: I1007 13:08:53.117903 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:53 crc kubenswrapper[4677]: I1007 13:08:53.117919 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:53 crc kubenswrapper[4677]: I1007 13:08:53.117943 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:53 crc kubenswrapper[4677]: I1007 13:08:53.117968 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:53Z","lastTransitionTime":"2025-10-07T13:08:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:08:53 crc kubenswrapper[4677]: I1007 13:08:53.220751 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:53 crc kubenswrapper[4677]: I1007 13:08:53.220807 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:53 crc kubenswrapper[4677]: I1007 13:08:53.220829 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:53 crc kubenswrapper[4677]: I1007 13:08:53.220858 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:53 crc kubenswrapper[4677]: I1007 13:08:53.220878 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:53Z","lastTransitionTime":"2025-10-07T13:08:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:53 crc kubenswrapper[4677]: I1007 13:08:53.302683 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 13:08:53 crc kubenswrapper[4677]: I1007 13:08:53.302796 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8bljr" Oct 07 13:08:53 crc kubenswrapper[4677]: E1007 13:08:53.302868 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 13:08:53 crc kubenswrapper[4677]: I1007 13:08:53.302661 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:08:53 crc kubenswrapper[4677]: E1007 13:08:53.302978 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8bljr" podUID="f63a77a6-7e4a-4ed0-a996-b8f80233d10c" Oct 07 13:08:53 crc kubenswrapper[4677]: E1007 13:08:53.303146 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 13:08:53 crc kubenswrapper[4677]: I1007 13:08:53.322942 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:53 crc kubenswrapper[4677]: I1007 13:08:53.323223 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:53 crc kubenswrapper[4677]: I1007 13:08:53.323340 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:53 crc kubenswrapper[4677]: I1007 13:08:53.323469 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:53 crc kubenswrapper[4677]: I1007 13:08:53.323598 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:53Z","lastTransitionTime":"2025-10-07T13:08:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:53 crc kubenswrapper[4677]: I1007 13:08:53.426634 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:53 crc kubenswrapper[4677]: I1007 13:08:53.426694 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:53 crc kubenswrapper[4677]: I1007 13:08:53.426714 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:53 crc kubenswrapper[4677]: I1007 13:08:53.426773 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:53 crc kubenswrapper[4677]: I1007 13:08:53.426791 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:53Z","lastTransitionTime":"2025-10-07T13:08:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:08:53 crc kubenswrapper[4677]: I1007 13:08:53.530140 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:53 crc kubenswrapper[4677]: I1007 13:08:53.530179 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:53 crc kubenswrapper[4677]: I1007 13:08:53.530190 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:53 crc kubenswrapper[4677]: I1007 13:08:53.530207 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:53 crc kubenswrapper[4677]: I1007 13:08:53.530220 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:53Z","lastTransitionTime":"2025-10-07T13:08:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:53 crc kubenswrapper[4677]: I1007 13:08:53.633181 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:53 crc kubenswrapper[4677]: I1007 13:08:53.633229 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:53 crc kubenswrapper[4677]: I1007 13:08:53.633241 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:53 crc kubenswrapper[4677]: I1007 13:08:53.633258 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:53 crc kubenswrapper[4677]: I1007 13:08:53.633272 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:53Z","lastTransitionTime":"2025-10-07T13:08:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:53 crc kubenswrapper[4677]: I1007 13:08:53.735285 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:53 crc kubenswrapper[4677]: I1007 13:08:53.735327 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:53 crc kubenswrapper[4677]: I1007 13:08:53.735338 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:53 crc kubenswrapper[4677]: I1007 13:08:53.735354 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:53 crc kubenswrapper[4677]: I1007 13:08:53.735364 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:53Z","lastTransitionTime":"2025-10-07T13:08:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:08:53 crc kubenswrapper[4677]: I1007 13:08:53.838094 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:53 crc kubenswrapper[4677]: I1007 13:08:53.838160 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:53 crc kubenswrapper[4677]: I1007 13:08:53.838181 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:53 crc kubenswrapper[4677]: I1007 13:08:53.838206 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:53 crc kubenswrapper[4677]: I1007 13:08:53.838224 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:53Z","lastTransitionTime":"2025-10-07T13:08:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:53 crc kubenswrapper[4677]: I1007 13:08:53.941107 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:53 crc kubenswrapper[4677]: I1007 13:08:53.941199 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:53 crc kubenswrapper[4677]: I1007 13:08:53.941220 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:53 crc kubenswrapper[4677]: I1007 13:08:53.941245 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:53 crc kubenswrapper[4677]: I1007 13:08:53.941262 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:53Z","lastTransitionTime":"2025-10-07T13:08:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:54 crc kubenswrapper[4677]: I1007 13:08:54.048075 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:54 crc kubenswrapper[4677]: I1007 13:08:54.048169 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:54 crc kubenswrapper[4677]: I1007 13:08:54.048191 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:54 crc kubenswrapper[4677]: I1007 13:08:54.048222 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:54 crc kubenswrapper[4677]: I1007 13:08:54.048243 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:54Z","lastTransitionTime":"2025-10-07T13:08:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:08:54 crc kubenswrapper[4677]: I1007 13:08:54.151158 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:54 crc kubenswrapper[4677]: I1007 13:08:54.151239 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:54 crc kubenswrapper[4677]: I1007 13:08:54.151263 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:54 crc kubenswrapper[4677]: I1007 13:08:54.151293 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:54 crc kubenswrapper[4677]: I1007 13:08:54.151318 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:54Z","lastTransitionTime":"2025-10-07T13:08:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:54 crc kubenswrapper[4677]: I1007 13:08:54.254748 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:54 crc kubenswrapper[4677]: I1007 13:08:54.254803 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:54 crc kubenswrapper[4677]: I1007 13:08:54.254819 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:54 crc kubenswrapper[4677]: I1007 13:08:54.254841 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:54 crc kubenswrapper[4677]: I1007 13:08:54.254858 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:54Z","lastTransitionTime":"2025-10-07T13:08:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:54 crc kubenswrapper[4677]: I1007 13:08:54.302359 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 13:08:54 crc kubenswrapper[4677]: E1007 13:08:54.302785 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 13:08:54 crc kubenswrapper[4677]: I1007 13:08:54.357861 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:54 crc kubenswrapper[4677]: I1007 13:08:54.357983 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:54 crc kubenswrapper[4677]: I1007 13:08:54.358012 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:54 crc kubenswrapper[4677]: I1007 13:08:54.358039 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:54 crc kubenswrapper[4677]: I1007 13:08:54.358060 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:54Z","lastTransitionTime":"2025-10-07T13:08:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:54 crc kubenswrapper[4677]: I1007 13:08:54.461690 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:54 crc kubenswrapper[4677]: I1007 13:08:54.461815 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:54 crc kubenswrapper[4677]: I1007 13:08:54.461839 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:54 crc kubenswrapper[4677]: I1007 13:08:54.461863 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:54 crc kubenswrapper[4677]: I1007 13:08:54.461880 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:54Z","lastTransitionTime":"2025-10-07T13:08:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:08:54 crc kubenswrapper[4677]: I1007 13:08:54.566747 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:54 crc kubenswrapper[4677]: I1007 13:08:54.566809 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:54 crc kubenswrapper[4677]: I1007 13:08:54.566827 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:54 crc kubenswrapper[4677]: I1007 13:08:54.566851 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:54 crc kubenswrapper[4677]: I1007 13:08:54.566870 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:54Z","lastTransitionTime":"2025-10-07T13:08:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:54 crc kubenswrapper[4677]: I1007 13:08:54.671700 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:54 crc kubenswrapper[4677]: I1007 13:08:54.671778 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:54 crc kubenswrapper[4677]: I1007 13:08:54.671798 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:54 crc kubenswrapper[4677]: I1007 13:08:54.671829 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:54 crc kubenswrapper[4677]: I1007 13:08:54.671853 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:54Z","lastTransitionTime":"2025-10-07T13:08:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:54 crc kubenswrapper[4677]: I1007 13:08:54.775776 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:54 crc kubenswrapper[4677]: I1007 13:08:54.776009 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:54 crc kubenswrapper[4677]: I1007 13:08:54.776042 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:54 crc kubenswrapper[4677]: I1007 13:08:54.776072 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:54 crc kubenswrapper[4677]: I1007 13:08:54.776106 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:54Z","lastTransitionTime":"2025-10-07T13:08:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:08:54 crc kubenswrapper[4677]: I1007 13:08:54.879358 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:54 crc kubenswrapper[4677]: I1007 13:08:54.879491 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:54 crc kubenswrapper[4677]: I1007 13:08:54.879577 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:54 crc kubenswrapper[4677]: I1007 13:08:54.879699 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:54 crc kubenswrapper[4677]: I1007 13:08:54.879771 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:54Z","lastTransitionTime":"2025-10-07T13:08:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:54 crc kubenswrapper[4677]: I1007 13:08:54.982688 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:54 crc kubenswrapper[4677]: I1007 13:08:54.982742 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:54 crc kubenswrapper[4677]: I1007 13:08:54.982755 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:54 crc kubenswrapper[4677]: I1007 13:08:54.982774 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:54 crc kubenswrapper[4677]: I1007 13:08:54.982787 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:54Z","lastTransitionTime":"2025-10-07T13:08:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:55 crc kubenswrapper[4677]: I1007 13:08:55.086271 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:55 crc kubenswrapper[4677]: I1007 13:08:55.086368 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:55 crc kubenswrapper[4677]: I1007 13:08:55.086391 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:55 crc kubenswrapper[4677]: I1007 13:08:55.086421 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:55 crc kubenswrapper[4677]: I1007 13:08:55.086497 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:55Z","lastTransitionTime":"2025-10-07T13:08:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:08:55 crc kubenswrapper[4677]: I1007 13:08:55.189606 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:55 crc kubenswrapper[4677]: I1007 13:08:55.189662 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:55 crc kubenswrapper[4677]: I1007 13:08:55.189680 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:55 crc kubenswrapper[4677]: I1007 13:08:55.189701 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:55 crc kubenswrapper[4677]: I1007 13:08:55.189717 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:55Z","lastTransitionTime":"2025-10-07T13:08:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:55 crc kubenswrapper[4677]: I1007 13:08:55.292841 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:55 crc kubenswrapper[4677]: I1007 13:08:55.292905 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:55 crc kubenswrapper[4677]: I1007 13:08:55.292930 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:55 crc kubenswrapper[4677]: I1007 13:08:55.292960 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:55 crc kubenswrapper[4677]: I1007 13:08:55.292981 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:55Z","lastTransitionTime":"2025-10-07T13:08:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:55 crc kubenswrapper[4677]: I1007 13:08:55.302842 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8bljr" Oct 07 13:08:55 crc kubenswrapper[4677]: I1007 13:08:55.302842 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 13:08:55 crc kubenswrapper[4677]: I1007 13:08:55.302927 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:08:55 crc kubenswrapper[4677]: E1007 13:08:55.303102 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8bljr" podUID="f63a77a6-7e4a-4ed0-a996-b8f80233d10c" Oct 07 13:08:55 crc kubenswrapper[4677]: E1007 13:08:55.303491 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 13:08:55 crc kubenswrapper[4677]: E1007 13:08:55.303779 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 13:08:55 crc kubenswrapper[4677]: I1007 13:08:55.395641 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:55 crc kubenswrapper[4677]: I1007 13:08:55.395741 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:55 crc kubenswrapper[4677]: I1007 13:08:55.395761 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:55 crc kubenswrapper[4677]: I1007 13:08:55.395783 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:55 crc kubenswrapper[4677]: I1007 13:08:55.395800 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:55Z","lastTransitionTime":"2025-10-07T13:08:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:55 crc kubenswrapper[4677]: I1007 13:08:55.499052 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:55 crc kubenswrapper[4677]: I1007 13:08:55.499118 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:55 crc kubenswrapper[4677]: I1007 13:08:55.499138 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:55 crc kubenswrapper[4677]: I1007 13:08:55.499165 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:55 crc kubenswrapper[4677]: I1007 13:08:55.499186 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:55Z","lastTransitionTime":"2025-10-07T13:08:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:08:55 crc kubenswrapper[4677]: I1007 13:08:55.602754 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:55 crc kubenswrapper[4677]: I1007 13:08:55.602869 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:55 crc kubenswrapper[4677]: I1007 13:08:55.602888 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:55 crc kubenswrapper[4677]: I1007 13:08:55.602913 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:55 crc kubenswrapper[4677]: I1007 13:08:55.602931 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:55Z","lastTransitionTime":"2025-10-07T13:08:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:55 crc kubenswrapper[4677]: I1007 13:08:55.705408 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:55 crc kubenswrapper[4677]: I1007 13:08:55.705491 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:55 crc kubenswrapper[4677]: I1007 13:08:55.705510 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:55 crc kubenswrapper[4677]: I1007 13:08:55.705532 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:55 crc kubenswrapper[4677]: I1007 13:08:55.705551 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:55Z","lastTransitionTime":"2025-10-07T13:08:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:55 crc kubenswrapper[4677]: I1007 13:08:55.808860 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:55 crc kubenswrapper[4677]: I1007 13:08:55.808929 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:55 crc kubenswrapper[4677]: I1007 13:08:55.808952 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:55 crc kubenswrapper[4677]: I1007 13:08:55.808981 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:55 crc kubenswrapper[4677]: I1007 13:08:55.809003 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:55Z","lastTransitionTime":"2025-10-07T13:08:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:08:55 crc kubenswrapper[4677]: I1007 13:08:55.912426 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:55 crc kubenswrapper[4677]: I1007 13:08:55.912542 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:55 crc kubenswrapper[4677]: I1007 13:08:55.912566 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:55 crc kubenswrapper[4677]: I1007 13:08:55.912597 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:55 crc kubenswrapper[4677]: I1007 13:08:55.912619 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:55Z","lastTransitionTime":"2025-10-07T13:08:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:56 crc kubenswrapper[4677]: I1007 13:08:56.016226 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:56 crc kubenswrapper[4677]: I1007 13:08:56.016381 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:56 crc kubenswrapper[4677]: I1007 13:08:56.016409 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:56 crc kubenswrapper[4677]: I1007 13:08:56.016468 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:56 crc kubenswrapper[4677]: I1007 13:08:56.016489 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:56Z","lastTransitionTime":"2025-10-07T13:08:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:56 crc kubenswrapper[4677]: I1007 13:08:56.119527 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:56 crc kubenswrapper[4677]: I1007 13:08:56.119585 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:56 crc kubenswrapper[4677]: I1007 13:08:56.119604 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:56 crc kubenswrapper[4677]: I1007 13:08:56.119628 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:56 crc kubenswrapper[4677]: I1007 13:08:56.119648 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:56Z","lastTransitionTime":"2025-10-07T13:08:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:08:56 crc kubenswrapper[4677]: I1007 13:08:56.221989 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:56 crc kubenswrapper[4677]: I1007 13:08:56.222054 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:56 crc kubenswrapper[4677]: I1007 13:08:56.222074 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:56 crc kubenswrapper[4677]: I1007 13:08:56.222099 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:56 crc kubenswrapper[4677]: I1007 13:08:56.222116 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:56Z","lastTransitionTime":"2025-10-07T13:08:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:56 crc kubenswrapper[4677]: I1007 13:08:56.302665 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 13:08:56 crc kubenswrapper[4677]: E1007 13:08:56.303162 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 13:08:56 crc kubenswrapper[4677]: I1007 13:08:56.325285 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:56 crc kubenswrapper[4677]: I1007 13:08:56.325354 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:56 crc kubenswrapper[4677]: I1007 13:08:56.325370 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:56 crc kubenswrapper[4677]: I1007 13:08:56.325396 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:56 crc kubenswrapper[4677]: I1007 13:08:56.325414 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:56Z","lastTransitionTime":"2025-10-07T13:08:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:08:56 crc kubenswrapper[4677]: I1007 13:08:56.427896 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:56 crc kubenswrapper[4677]: I1007 13:08:56.427955 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:56 crc kubenswrapper[4677]: I1007 13:08:56.427966 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:56 crc kubenswrapper[4677]: I1007 13:08:56.427982 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:56 crc kubenswrapper[4677]: I1007 13:08:56.427995 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:56Z","lastTransitionTime":"2025-10-07T13:08:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:56 crc kubenswrapper[4677]: I1007 13:08:56.531261 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:56 crc kubenswrapper[4677]: I1007 13:08:56.531340 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:56 crc kubenswrapper[4677]: I1007 13:08:56.531363 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:56 crc kubenswrapper[4677]: I1007 13:08:56.531391 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:56 crc kubenswrapper[4677]: I1007 13:08:56.531408 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:56Z","lastTransitionTime":"2025-10-07T13:08:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:56 crc kubenswrapper[4677]: I1007 13:08:56.634335 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:56 crc kubenswrapper[4677]: I1007 13:08:56.634401 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:56 crc kubenswrapper[4677]: I1007 13:08:56.634418 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:56 crc kubenswrapper[4677]: I1007 13:08:56.634485 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:56 crc kubenswrapper[4677]: I1007 13:08:56.634502 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:56Z","lastTransitionTime":"2025-10-07T13:08:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:08:56 crc kubenswrapper[4677]: I1007 13:08:56.737661 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:56 crc kubenswrapper[4677]: I1007 13:08:56.737730 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:56 crc kubenswrapper[4677]: I1007 13:08:56.737748 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:56 crc kubenswrapper[4677]: I1007 13:08:56.737777 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:56 crc kubenswrapper[4677]: I1007 13:08:56.737796 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:56Z","lastTransitionTime":"2025-10-07T13:08:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:56 crc kubenswrapper[4677]: I1007 13:08:56.841166 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:56 crc kubenswrapper[4677]: I1007 13:08:56.841247 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:56 crc kubenswrapper[4677]: I1007 13:08:56.841268 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:56 crc kubenswrapper[4677]: I1007 13:08:56.841291 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:56 crc kubenswrapper[4677]: I1007 13:08:56.841310 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:56Z","lastTransitionTime":"2025-10-07T13:08:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:56 crc kubenswrapper[4677]: I1007 13:08:56.943856 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:56 crc kubenswrapper[4677]: I1007 13:08:56.943915 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:56 crc kubenswrapper[4677]: I1007 13:08:56.943932 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:56 crc kubenswrapper[4677]: I1007 13:08:56.943957 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:56 crc kubenswrapper[4677]: I1007 13:08:56.943976 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:56Z","lastTransitionTime":"2025-10-07T13:08:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:08:57 crc kubenswrapper[4677]: I1007 13:08:57.047287 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:57 crc kubenswrapper[4677]: I1007 13:08:57.047341 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:57 crc kubenswrapper[4677]: I1007 13:08:57.047358 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:57 crc kubenswrapper[4677]: I1007 13:08:57.047382 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:57 crc kubenswrapper[4677]: I1007 13:08:57.047398 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:57Z","lastTransitionTime":"2025-10-07T13:08:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:57 crc kubenswrapper[4677]: I1007 13:08:57.150210 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:57 crc kubenswrapper[4677]: I1007 13:08:57.150279 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:57 crc kubenswrapper[4677]: I1007 13:08:57.150295 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:57 crc kubenswrapper[4677]: I1007 13:08:57.150319 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:57 crc kubenswrapper[4677]: I1007 13:08:57.150347 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:57Z","lastTransitionTime":"2025-10-07T13:08:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:57 crc kubenswrapper[4677]: I1007 13:08:57.258632 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:57 crc kubenswrapper[4677]: I1007 13:08:57.259017 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:57 crc kubenswrapper[4677]: I1007 13:08:57.259246 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:57 crc kubenswrapper[4677]: I1007 13:08:57.259538 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:57 crc kubenswrapper[4677]: I1007 13:08:57.259558 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:57Z","lastTransitionTime":"2025-10-07T13:08:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:08:57 crc kubenswrapper[4677]: I1007 13:08:57.302255 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8bljr" Oct 07 13:08:57 crc kubenswrapper[4677]: I1007 13:08:57.302380 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:08:57 crc kubenswrapper[4677]: I1007 13:08:57.302285 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 13:08:57 crc kubenswrapper[4677]: E1007 13:08:57.302520 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8bljr" podUID="f63a77a6-7e4a-4ed0-a996-b8f80233d10c" Oct 07 13:08:57 crc kubenswrapper[4677]: E1007 13:08:57.302601 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 13:08:57 crc kubenswrapper[4677]: E1007 13:08:57.302721 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 13:08:57 crc kubenswrapper[4677]: I1007 13:08:57.363521 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:57 crc kubenswrapper[4677]: I1007 13:08:57.363605 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:57 crc kubenswrapper[4677]: I1007 13:08:57.363630 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:57 crc kubenswrapper[4677]: I1007 13:08:57.363660 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:57 crc kubenswrapper[4677]: I1007 13:08:57.363682 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:57Z","lastTransitionTime":"2025-10-07T13:08:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:08:57 crc kubenswrapper[4677]: I1007 13:08:57.466915 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:57 crc kubenswrapper[4677]: I1007 13:08:57.466968 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:57 crc kubenswrapper[4677]: I1007 13:08:57.466986 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:57 crc kubenswrapper[4677]: I1007 13:08:57.467012 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:57 crc kubenswrapper[4677]: I1007 13:08:57.467029 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:57Z","lastTransitionTime":"2025-10-07T13:08:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:57 crc kubenswrapper[4677]: I1007 13:08:57.570571 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:57 crc kubenswrapper[4677]: I1007 13:08:57.570631 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:57 crc kubenswrapper[4677]: I1007 13:08:57.570648 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:57 crc kubenswrapper[4677]: I1007 13:08:57.570674 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:57 crc kubenswrapper[4677]: I1007 13:08:57.570693 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:57Z","lastTransitionTime":"2025-10-07T13:08:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:08:57 crc kubenswrapper[4677]: I1007 13:08:57.612840 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:08:57 crc kubenswrapper[4677]: I1007 13:08:57.612929 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:08:57 crc kubenswrapper[4677]: I1007 13:08:57.612962 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:08:57 crc kubenswrapper[4677]: I1007 13:08:57.612993 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:08:57 crc kubenswrapper[4677]: I1007 13:08:57.613014 4677 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:08:57Z","lastTransitionTime":"2025-10-07T13:08:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:08:57 crc kubenswrapper[4677]: I1007 13:08:57.713779 4677 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-r8hp6"] Oct 07 13:08:57 crc kubenswrapper[4677]: I1007 13:08:57.714202 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-r8hp6" Oct 07 13:08:57 crc kubenswrapper[4677]: W1007 13:08:57.716090 4677 reflector.go:561] object-"openshift-cluster-version"/"default-dockercfg-gxtc4": failed to list *v1.Secret: secrets "default-dockercfg-gxtc4" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-cluster-version": no relationship found between node 'crc' and this object Oct 07 13:08:57 crc kubenswrapper[4677]: I1007 13:08:57.716128 4677 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Oct 07 13:08:57 crc kubenswrapper[4677]: E1007 13:08:57.716146 4677 reflector.go:158] "Unhandled Error" err="object-\"openshift-cluster-version\"/\"default-dockercfg-gxtc4\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"default-dockercfg-gxtc4\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-cluster-version\": no relationship found between node 'crc' and this object" logger="UnhandledError" Oct 07 13:08:57 crc kubenswrapper[4677]: I1007 13:08:57.716576 4677 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Oct 07 13:08:57 crc kubenswrapper[4677]: I1007 13:08:57.717046 4677 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Oct 07 13:08:57 crc kubenswrapper[4677]: I1007 13:08:57.748359 4677 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=88.748333747 podStartE2EDuration="1m28.748333747s" podCreationTimestamp="2025-10-07 13:07:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:08:57.735698398 +0000 UTC m=+109.221407523" watchObservedRunningTime="2025-10-07 13:08:57.748333747 +0000 UTC m=+109.234042882" Oct 07 13:08:57 crc kubenswrapper[4677]: I1007 13:08:57.793824 4677 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-r7cnz" podStartSLOduration=89.793800541 podStartE2EDuration="1m29.793800541s" podCreationTimestamp="2025-10-07 13:07:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:08:57.763232464 +0000 UTC m=+109.248941639" watchObservedRunningTime="2025-10-07 13:08:57.793800541 +0000 UTC m=+109.279509686" Oct 07 13:08:57 crc kubenswrapper[4677]: I1007 13:08:57.809752 4677 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-pjgpx" podStartSLOduration=89.809735999 podStartE2EDuration="1m29.809735999s" podCreationTimestamp="2025-10-07 13:07:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:08:57.809591675 +0000 UTC 
m=+109.295300830" watchObservedRunningTime="2025-10-07 13:08:57.809735999 +0000 UTC m=+109.295445114" Oct 07 13:08:57 crc kubenswrapper[4677]: I1007 13:08:57.836174 4677 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qf2v9" podStartSLOduration=88.836151411 podStartE2EDuration="1m28.836151411s" podCreationTimestamp="2025-10-07 13:07:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:08:57.82410845 +0000 UTC m=+109.309817585" watchObservedRunningTime="2025-10-07 13:08:57.836151411 +0000 UTC m=+109.321860546" Oct 07 13:08:57 crc kubenswrapper[4677]: I1007 13:08:57.861940 4677 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=52.861918214 podStartE2EDuration="52.861918214s" podCreationTimestamp="2025-10-07 13:08:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:08:57.861550663 +0000 UTC m=+109.347259778" watchObservedRunningTime="2025-10-07 13:08:57.861918214 +0000 UTC m=+109.347627349" Oct 07 13:08:57 crc kubenswrapper[4677]: I1007 13:08:57.873271 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/7d088e23-948f-40c0-a2e5-4d94022456b3-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-r8hp6\" (UID: \"7d088e23-948f-40c0-a2e5-4d94022456b3\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-r8hp6" Oct 07 13:08:57 crc kubenswrapper[4677]: I1007 13:08:57.873344 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7d088e23-948f-40c0-a2e5-4d94022456b3-service-ca\") pod \"cluster-version-operator-5c965bbfc6-r8hp6\" (UID: \"7d088e23-948f-40c0-a2e5-4d94022456b3\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-r8hp6" Oct 07 13:08:57 crc kubenswrapper[4677]: I1007 13:08:57.873414 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7d088e23-948f-40c0-a2e5-4d94022456b3-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-r8hp6\" (UID: \"7d088e23-948f-40c0-a2e5-4d94022456b3\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-r8hp6" Oct 07 13:08:57 crc kubenswrapper[4677]: I1007 13:08:57.873528 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/7d088e23-948f-40c0-a2e5-4d94022456b3-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-r8hp6\" (UID: \"7d088e23-948f-40c0-a2e5-4d94022456b3\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-r8hp6" Oct 07 13:08:57 crc kubenswrapper[4677]: I1007 13:08:57.873560 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7d088e23-948f-40c0-a2e5-4d94022456b3-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-r8hp6\" (UID: \"7d088e23-948f-40c0-a2e5-4d94022456b3\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-r8hp6" Oct 07 13:08:57 crc 
kubenswrapper[4677]: I1007 13:08:57.887527 4677 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-c2h2k" podStartSLOduration=89.887511422 podStartE2EDuration="1m29.887511422s" podCreationTimestamp="2025-10-07 13:07:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:08:57.887125311 +0000 UTC m=+109.372834436" watchObservedRunningTime="2025-10-07 13:08:57.887511422 +0000 UTC m=+109.373220537" Oct 07 13:08:57 crc kubenswrapper[4677]: I1007 13:08:57.941064 4677 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-czmsr" podStartSLOduration=89.941046668 podStartE2EDuration="1m29.941046668s" podCreationTimestamp="2025-10-07 13:07:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:08:57.91511354 +0000 UTC m=+109.400822695" watchObservedRunningTime="2025-10-07 13:08:57.941046668 +0000 UTC m=+109.426755783" Oct 07 13:08:57 crc kubenswrapper[4677]: I1007 13:08:57.941314 4677 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=89.941308136 podStartE2EDuration="1m29.941308136s" podCreationTimestamp="2025-10-07 13:07:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:08:57.940477901 +0000 UTC m=+109.426187026" watchObservedRunningTime="2025-10-07 13:08:57.941308136 +0000 UTC m=+109.427017251" Oct 07 13:08:57 crc kubenswrapper[4677]: I1007 13:08:57.953810 4677 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=88.953789171 podStartE2EDuration="1m28.953789171s" podCreationTimestamp="2025-10-07 13:07:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:08:57.95311089 +0000 UTC m=+109.438820055" watchObservedRunningTime="2025-10-07 13:08:57.953789171 +0000 UTC m=+109.439498286" Oct 07 13:08:57 crc kubenswrapper[4677]: I1007 13:08:57.968988 4677 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=36.968974026 podStartE2EDuration="36.968974026s" podCreationTimestamp="2025-10-07 13:08:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:08:57.968861843 +0000 UTC m=+109.454570988" watchObservedRunningTime="2025-10-07 13:08:57.968974026 +0000 UTC m=+109.454683141" Oct 07 13:08:57 crc kubenswrapper[4677]: I1007 13:08:57.974576 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/7d088e23-948f-40c0-a2e5-4d94022456b3-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-r8hp6\" (UID: \"7d088e23-948f-40c0-a2e5-4d94022456b3\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-r8hp6" Oct 07 13:08:57 crc kubenswrapper[4677]: I1007 13:08:57.974643 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7d088e23-948f-40c0-a2e5-4d94022456b3-service-ca\") pod 
\"cluster-version-operator-5c965bbfc6-r8hp6\" (UID: \"7d088e23-948f-40c0-a2e5-4d94022456b3\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-r8hp6" Oct 07 13:08:57 crc kubenswrapper[4677]: I1007 13:08:57.974689 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7d088e23-948f-40c0-a2e5-4d94022456b3-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-r8hp6\" (UID: \"7d088e23-948f-40c0-a2e5-4d94022456b3\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-r8hp6" Oct 07 13:08:57 crc kubenswrapper[4677]: I1007 13:08:57.974739 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/7d088e23-948f-40c0-a2e5-4d94022456b3-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-r8hp6\" (UID: \"7d088e23-948f-40c0-a2e5-4d94022456b3\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-r8hp6" Oct 07 13:08:57 crc kubenswrapper[4677]: I1007 13:08:57.974773 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/7d088e23-948f-40c0-a2e5-4d94022456b3-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-r8hp6\" (UID: \"7d088e23-948f-40c0-a2e5-4d94022456b3\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-r8hp6" Oct 07 13:08:57 crc kubenswrapper[4677]: I1007 13:08:57.974807 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7d088e23-948f-40c0-a2e5-4d94022456b3-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-r8hp6\" (UID: \"7d088e23-948f-40c0-a2e5-4d94022456b3\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-r8hp6" Oct 07 13:08:57 crc kubenswrapper[4677]: I1007 13:08:57.975147 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/7d088e23-948f-40c0-a2e5-4d94022456b3-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-r8hp6\" (UID: \"7d088e23-948f-40c0-a2e5-4d94022456b3\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-r8hp6" Oct 07 13:08:57 crc kubenswrapper[4677]: I1007 13:08:57.975758 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7d088e23-948f-40c0-a2e5-4d94022456b3-service-ca\") pod \"cluster-version-operator-5c965bbfc6-r8hp6\" (UID: \"7d088e23-948f-40c0-a2e5-4d94022456b3\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-r8hp6" Oct 07 13:08:57 crc kubenswrapper[4677]: I1007 13:08:57.980422 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7d088e23-948f-40c0-a2e5-4d94022456b3-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-r8hp6\" (UID: \"7d088e23-948f-40c0-a2e5-4d94022456b3\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-r8hp6" Oct 07 13:08:58 crc kubenswrapper[4677]: I1007 13:08:58.010010 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7d088e23-948f-40c0-a2e5-4d94022456b3-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-r8hp6\" (UID: \"7d088e23-948f-40c0-a2e5-4d94022456b3\") " 
pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-r8hp6" Oct 07 13:08:58 crc kubenswrapper[4677]: I1007 13:08:58.021698 4677 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-8xd94" podStartSLOduration=90.021681888 podStartE2EDuration="1m30.021681888s" podCreationTimestamp="2025-10-07 13:07:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:08:58.01211137 +0000 UTC m=+109.497820485" watchObservedRunningTime="2025-10-07 13:08:58.021681888 +0000 UTC m=+109.507391013" Oct 07 13:08:58 crc kubenswrapper[4677]: I1007 13:08:58.302146 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 13:08:58 crc kubenswrapper[4677]: E1007 13:08:58.302298 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 13:08:58 crc kubenswrapper[4677]: I1007 13:08:58.711993 4677 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Oct 07 13:08:58 crc kubenswrapper[4677]: I1007 13:08:58.718530 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-r8hp6" Oct 07 13:08:58 crc kubenswrapper[4677]: I1007 13:08:58.936973 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-r8hp6" event={"ID":"7d088e23-948f-40c0-a2e5-4d94022456b3","Type":"ContainerStarted","Data":"4a598bfc006e6fdea921f7241b78f5144db3e75b053de71227f99d24db06ee5d"} Oct 07 13:08:58 crc kubenswrapper[4677]: I1007 13:08:58.937378 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-r8hp6" event={"ID":"7d088e23-948f-40c0-a2e5-4d94022456b3","Type":"ContainerStarted","Data":"b405d76f769d548e0af808df5244f97f621db1b2c4b292d6cc618ba2dfffdc67"} Oct 07 13:08:59 crc kubenswrapper[4677]: I1007 13:08:59.302952 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8bljr" Oct 07 13:08:59 crc kubenswrapper[4677]: I1007 13:08:59.303132 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 13:08:59 crc kubenswrapper[4677]: I1007 13:08:59.305229 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:08:59 crc kubenswrapper[4677]: E1007 13:08:59.305221 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8bljr" podUID="f63a77a6-7e4a-4ed0-a996-b8f80233d10c" Oct 07 13:08:59 crc kubenswrapper[4677]: E1007 13:08:59.305387 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 13:08:59 crc kubenswrapper[4677]: E1007 13:08:59.305540 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 13:09:00 crc kubenswrapper[4677]: I1007 13:09:00.302841 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 13:09:00 crc kubenswrapper[4677]: E1007 13:09:00.303122 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 13:09:01 crc kubenswrapper[4677]: I1007 13:09:01.302161 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8bljr" Oct 07 13:09:01 crc kubenswrapper[4677]: I1007 13:09:01.302263 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:09:01 crc kubenswrapper[4677]: E1007 13:09:01.302379 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8bljr" podUID="f63a77a6-7e4a-4ed0-a996-b8f80233d10c" Oct 07 13:09:01 crc kubenswrapper[4677]: I1007 13:09:01.302400 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 13:09:01 crc kubenswrapper[4677]: E1007 13:09:01.302604 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 13:09:01 crc kubenswrapper[4677]: E1007 13:09:01.302765 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 13:09:02 crc kubenswrapper[4677]: I1007 13:09:02.302424 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 13:09:02 crc kubenswrapper[4677]: E1007 13:09:02.302665 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 13:09:03 crc kubenswrapper[4677]: I1007 13:09:03.302685 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8bljr" Oct 07 13:09:03 crc kubenswrapper[4677]: I1007 13:09:03.302813 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:09:03 crc kubenswrapper[4677]: I1007 13:09:03.302727 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 13:09:03 crc kubenswrapper[4677]: E1007 13:09:03.302896 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8bljr" podUID="f63a77a6-7e4a-4ed0-a996-b8f80233d10c" Oct 07 13:09:03 crc kubenswrapper[4677]: E1007 13:09:03.303049 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 13:09:03 crc kubenswrapper[4677]: E1007 13:09:03.303198 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 13:09:03 crc kubenswrapper[4677]: I1007 13:09:03.953740 4677 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-pjgpx_73bebfb3-50b5-48b6-b348-1d1feb6202d2/kube-multus/1.log" Oct 07 13:09:03 crc kubenswrapper[4677]: I1007 13:09:03.954401 4677 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-pjgpx_73bebfb3-50b5-48b6-b348-1d1feb6202d2/kube-multus/0.log" Oct 07 13:09:03 crc kubenswrapper[4677]: I1007 13:09:03.954481 4677 generic.go:334] "Generic (PLEG): container finished" podID="73bebfb3-50b5-48b6-b348-1d1feb6202d2" containerID="fcea2caf828321399fab99a6225cb39dd0c4aba8481cc040a10d86e90b6e4029" exitCode=1 Oct 07 13:09:03 crc kubenswrapper[4677]: I1007 13:09:03.954524 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-pjgpx" event={"ID":"73bebfb3-50b5-48b6-b348-1d1feb6202d2","Type":"ContainerDied","Data":"fcea2caf828321399fab99a6225cb39dd0c4aba8481cc040a10d86e90b6e4029"} Oct 07 13:09:03 crc kubenswrapper[4677]: I1007 13:09:03.954579 4677 scope.go:117] "RemoveContainer" containerID="6b0ac92c71edc3d5107aece2d0e005a546cf25d79d696f4e330b7c0c8babc546" Oct 07 13:09:03 crc kubenswrapper[4677]: I1007 13:09:03.955200 4677 scope.go:117] "RemoveContainer" containerID="fcea2caf828321399fab99a6225cb39dd0c4aba8481cc040a10d86e90b6e4029" Oct 07 13:09:03 crc kubenswrapper[4677]: E1007 13:09:03.955582 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-pjgpx_openshift-multus(73bebfb3-50b5-48b6-b348-1d1feb6202d2)\"" pod="openshift-multus/multus-pjgpx" podUID="73bebfb3-50b5-48b6-b348-1d1feb6202d2" Oct 07 13:09:03 crc kubenswrapper[4677]: I1007 13:09:03.976959 4677 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-r8hp6" podStartSLOduration=95.976940942 podStartE2EDuration="1m35.976940942s" podCreationTimestamp="2025-10-07 13:07:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:08:58.963382719 +0000 UTC m=+110.449091924" watchObservedRunningTime="2025-10-07 13:09:03.976940942 +0000 UTC m=+115.462650077" Oct 07 13:09:04 crc kubenswrapper[4677]: I1007 13:09:04.302203 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 13:09:04 crc kubenswrapper[4677]: E1007 13:09:04.302369 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 13:09:04 crc kubenswrapper[4677]: I1007 13:09:04.303368 4677 scope.go:117] "RemoveContainer" containerID="b0636ac68feadc31552df6dee8669b4b1d477332b3405bff9bd63eaaa3362a6f" Oct 07 13:09:04 crc kubenswrapper[4677]: E1007 13:09:04.303677 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-29c8j_openshift-ovn-kubernetes(3458826a-000d-407d-92c8-236d1a05842e)\"" pod="openshift-ovn-kubernetes/ovnkube-node-29c8j" podUID="3458826a-000d-407d-92c8-236d1a05842e" Oct 07 13:09:04 crc kubenswrapper[4677]: I1007 13:09:04.959891 4677 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-pjgpx_73bebfb3-50b5-48b6-b348-1d1feb6202d2/kube-multus/1.log" Oct 07 13:09:05 crc kubenswrapper[4677]: I1007 13:09:05.302863 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8bljr" Oct 07 13:09:05 crc kubenswrapper[4677]: I1007 13:09:05.302933 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 13:09:05 crc kubenswrapper[4677]: I1007 13:09:05.302972 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:09:05 crc kubenswrapper[4677]: E1007 13:09:05.303067 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8bljr" podUID="f63a77a6-7e4a-4ed0-a996-b8f80233d10c" Oct 07 13:09:05 crc kubenswrapper[4677]: E1007 13:09:05.303262 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 13:09:05 crc kubenswrapper[4677]: E1007 13:09:05.303417 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 13:09:06 crc kubenswrapper[4677]: I1007 13:09:06.302745 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 13:09:06 crc kubenswrapper[4677]: E1007 13:09:06.302917 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 13:09:07 crc kubenswrapper[4677]: I1007 13:09:07.303081 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8bljr" Oct 07 13:09:07 crc kubenswrapper[4677]: E1007 13:09:07.303331 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8bljr" podUID="f63a77a6-7e4a-4ed0-a996-b8f80233d10c" Oct 07 13:09:07 crc kubenswrapper[4677]: I1007 13:09:07.303717 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:09:07 crc kubenswrapper[4677]: E1007 13:09:07.303835 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 13:09:07 crc kubenswrapper[4677]: I1007 13:09:07.304187 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 13:09:07 crc kubenswrapper[4677]: E1007 13:09:07.304277 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 13:09:08 crc kubenswrapper[4677]: I1007 13:09:08.302777 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 13:09:08 crc kubenswrapper[4677]: E1007 13:09:08.303330 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 13:09:09 crc kubenswrapper[4677]: I1007 13:09:09.302961 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8bljr" Oct 07 13:09:09 crc kubenswrapper[4677]: I1007 13:09:09.303015 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:09:09 crc kubenswrapper[4677]: I1007 13:09:09.303079 4677 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 13:09:09 crc kubenswrapper[4677]: E1007 13:09:09.305058 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8bljr" podUID="f63a77a6-7e4a-4ed0-a996-b8f80233d10c" Oct 07 13:09:09 crc kubenswrapper[4677]: E1007 13:09:09.305145 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 13:09:09 crc kubenswrapper[4677]: E1007 13:09:09.305221 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 13:09:09 crc kubenswrapper[4677]: E1007 13:09:09.339712 4677 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Oct 07 13:09:09 crc kubenswrapper[4677]: E1007 13:09:09.401546 4677 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 07 13:09:10 crc kubenswrapper[4677]: I1007 13:09:10.303027 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 13:09:10 crc kubenswrapper[4677]: E1007 13:09:10.303154 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 13:09:11 crc kubenswrapper[4677]: I1007 13:09:11.302686 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 13:09:11 crc kubenswrapper[4677]: E1007 13:09:11.302798 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 13:09:11 crc kubenswrapper[4677]: I1007 13:09:11.302694 4677 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-8bljr" Oct 07 13:09:11 crc kubenswrapper[4677]: E1007 13:09:11.302967 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8bljr" podUID="f63a77a6-7e4a-4ed0-a996-b8f80233d10c" Oct 07 13:09:11 crc kubenswrapper[4677]: I1007 13:09:11.303019 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:09:11 crc kubenswrapper[4677]: E1007 13:09:11.303171 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 13:09:12 crc kubenswrapper[4677]: I1007 13:09:12.302073 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 13:09:12 crc kubenswrapper[4677]: E1007 13:09:12.302553 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 13:09:13 crc kubenswrapper[4677]: I1007 13:09:13.302052 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:09:13 crc kubenswrapper[4677]: I1007 13:09:13.302120 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8bljr" Oct 07 13:09:13 crc kubenswrapper[4677]: I1007 13:09:13.302066 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 13:09:13 crc kubenswrapper[4677]: E1007 13:09:13.302192 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 13:09:13 crc kubenswrapper[4677]: E1007 13:09:13.302400 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 13:09:13 crc kubenswrapper[4677]: E1007 13:09:13.302586 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8bljr" podUID="f63a77a6-7e4a-4ed0-a996-b8f80233d10c" Oct 07 13:09:14 crc kubenswrapper[4677]: I1007 13:09:14.302920 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 13:09:14 crc kubenswrapper[4677]: E1007 13:09:14.303180 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 13:09:14 crc kubenswrapper[4677]: E1007 13:09:14.403386 4677 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 07 13:09:15 crc kubenswrapper[4677]: I1007 13:09:15.302553 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8bljr" Oct 07 13:09:15 crc kubenswrapper[4677]: I1007 13:09:15.302644 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 13:09:15 crc kubenswrapper[4677]: I1007 13:09:15.302691 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:09:15 crc kubenswrapper[4677]: E1007 13:09:15.302817 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8bljr" podUID="f63a77a6-7e4a-4ed0-a996-b8f80233d10c" Oct 07 13:09:15 crc kubenswrapper[4677]: E1007 13:09:15.303048 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 13:09:15 crc kubenswrapper[4677]: E1007 13:09:15.303733 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 13:09:15 crc kubenswrapper[4677]: I1007 13:09:15.304059 4677 scope.go:117] "RemoveContainer" containerID="fcea2caf828321399fab99a6225cb39dd0c4aba8481cc040a10d86e90b6e4029" Oct 07 13:09:15 crc kubenswrapper[4677]: I1007 13:09:15.304271 4677 scope.go:117] "RemoveContainer" containerID="b0636ac68feadc31552df6dee8669b4b1d477332b3405bff9bd63eaaa3362a6f" Oct 07 13:09:16 crc kubenswrapper[4677]: I1007 13:09:16.007337 4677 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-pjgpx_73bebfb3-50b5-48b6-b348-1d1feb6202d2/kube-multus/1.log" Oct 07 13:09:16 crc kubenswrapper[4677]: I1007 13:09:16.007761 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-pjgpx" event={"ID":"73bebfb3-50b5-48b6-b348-1d1feb6202d2","Type":"ContainerStarted","Data":"cab6ba341a7d3ec923ec6a10fba00b684271e2e0c030e0ed8b119f472414895a"} Oct 07 13:09:16 crc kubenswrapper[4677]: I1007 13:09:16.009534 4677 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-29c8j_3458826a-000d-407d-92c8-236d1a05842e/ovnkube-controller/3.log" Oct 07 13:09:16 crc kubenswrapper[4677]: I1007 13:09:16.012741 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29c8j" event={"ID":"3458826a-000d-407d-92c8-236d1a05842e","Type":"ContainerStarted","Data":"4931c26a24a9442024978a83085456f080f6de6d5f334a435bcc6ced01d30f93"} Oct 07 13:09:16 crc kubenswrapper[4677]: I1007 13:09:16.013264 4677 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-29c8j" Oct 07 13:09:16 crc kubenswrapper[4677]: I1007 13:09:16.062954 4677 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-29c8j" podStartSLOduration=108.062924886 podStartE2EDuration="1m48.062924886s" podCreationTimestamp="2025-10-07 13:07:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:09:16.061560635 +0000 UTC m=+127.547269830" watchObservedRunningTime="2025-10-07 13:09:16.062924886 +0000 UTC m=+127.548634031" Oct 07 13:09:16 crc kubenswrapper[4677]: I1007 13:09:16.183085 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-8bljr"] Oct 07 13:09:16 crc kubenswrapper[4677]: I1007 13:09:16.183240 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8bljr" Oct 07 13:09:16 crc kubenswrapper[4677]: E1007 13:09:16.183364 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8bljr" podUID="f63a77a6-7e4a-4ed0-a996-b8f80233d10c" Oct 07 13:09:16 crc kubenswrapper[4677]: I1007 13:09:16.302561 4677 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 13:09:16 crc kubenswrapper[4677]: E1007 13:09:16.302686 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 13:09:17 crc kubenswrapper[4677]: I1007 13:09:17.303024 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 13:09:17 crc kubenswrapper[4677]: I1007 13:09:17.303059 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:09:17 crc kubenswrapper[4677]: E1007 13:09:17.303251 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 13:09:17 crc kubenswrapper[4677]: E1007 13:09:17.303335 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 13:09:18 crc kubenswrapper[4677]: I1007 13:09:18.302650 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8bljr" Oct 07 13:09:18 crc kubenswrapper[4677]: E1007 13:09:18.302823 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8bljr" podUID="f63a77a6-7e4a-4ed0-a996-b8f80233d10c" Oct 07 13:09:18 crc kubenswrapper[4677]: I1007 13:09:18.302660 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 13:09:18 crc kubenswrapper[4677]: E1007 13:09:18.303085 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 13:09:19 crc kubenswrapper[4677]: I1007 13:09:19.302373 4677 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 13:09:19 crc kubenswrapper[4677]: E1007 13:09:19.303677 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 13:09:19 crc kubenswrapper[4677]: I1007 13:09:19.303835 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:09:19 crc kubenswrapper[4677]: E1007 13:09:19.304104 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 13:09:20 crc kubenswrapper[4677]: I1007 13:09:20.302719 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 13:09:20 crc kubenswrapper[4677]: I1007 13:09:20.302760 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8bljr" Oct 07 13:09:20 crc kubenswrapper[4677]: I1007 13:09:20.305972 4677 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Oct 07 13:09:20 crc kubenswrapper[4677]: I1007 13:09:20.306071 4677 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Oct 07 13:09:20 crc kubenswrapper[4677]: I1007 13:09:20.306109 4677 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Oct 07 13:09:20 crc kubenswrapper[4677]: I1007 13:09:20.306204 4677 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Oct 07 13:09:20 crc kubenswrapper[4677]: I1007 13:09:20.483291 4677 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-29c8j" Oct 07 13:09:21 crc kubenswrapper[4677]: I1007 13:09:21.302472 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:09:21 crc kubenswrapper[4677]: I1007 13:09:21.302508 4677 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 13:09:21 crc kubenswrapper[4677]: I1007 13:09:21.305489 4677 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Oct 07 13:09:21 crc kubenswrapper[4677]: I1007 13:09:21.305745 4677 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.498938 4677 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.544503 4677 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-qk4fp"] Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.544865 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-qk4fp" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.547266 4677 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-b5qm4"] Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.548028 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-b5qm4" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.548360 4677 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.549821 4677 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.549900 4677 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.551603 4677 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-rfpfx"] Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.552062 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rfpfx" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.552091 4677 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.553658 4677 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.556190 4677 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-bkhfv"] Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.557486 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-bkhfv" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.558730 4677 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-5x69q"] Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.559495 4677 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5x69q" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.561393 4677 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.561545 4677 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.561807 4677 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.562269 4677 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.562459 4677 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.562822 4677 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.564328 4677 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-n2kcb"] Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.565138 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-n2kcb" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.570247 4677 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bv6vv"] Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.570957 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bv6vv" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.582254 4677 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-gspst"] Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.583321 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-gspst" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.585042 4677 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-zvfd4"] Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.588541 4677 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-zvfd4" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.588776 4677 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.588840 4677 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.588930 4677 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.589291 4677 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.589383 4677 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.589516 4677 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.589571 4677 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.589702 4677 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.589967 4677 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.590005 4677 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.590019 4677 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.590118 4677 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.590457 4677 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.590842 4677 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.591096 4677 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.600649 4677 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.601157 4677 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.601380 4677 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.601566 4677 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Oct 07 13:09:28 crc 
kubenswrapper[4677]: I1007 13:09:28.601672 4677 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.601787 4677 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.601951 4677 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.602054 4677 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.602165 4677 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.602273 4677 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.602355 4677 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.604995 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a5919bd4-5d4e-4aa1-9b66-36460e7e24f3-config\") pod \"apiserver-76f77b778f-bkhfv\" (UID: \"a5919bd4-5d4e-4aa1-9b66-36460e7e24f3\") " pod="openshift-apiserver/apiserver-76f77b778f-bkhfv" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.605042 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/a5919bd4-5d4e-4aa1-9b66-36460e7e24f3-etcd-client\") pod \"apiserver-76f77b778f-bkhfv\" (UID: \"a5919bd4-5d4e-4aa1-9b66-36460e7e24f3\") " pod="openshift-apiserver/apiserver-76f77b778f-bkhfv" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.605073 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dwmwr\" (UniqueName: \"kubernetes.io/projected/e5784d1c-d283-408d-a435-66b7cda6ac32-kube-api-access-dwmwr\") pod \"openshift-apiserver-operator-796bbdcf4f-bv6vv\" (UID: \"e5784d1c-d283-408d-a435-66b7cda6ac32\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bv6vv" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.605102 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nd6kp\" (UniqueName: \"kubernetes.io/projected/d96fbf2b-aad8-46ca-88a1-df1e9624f0ed-kube-api-access-nd6kp\") pod \"cluster-samples-operator-665b6dd947-zvfd4\" (UID: \"d96fbf2b-aad8-46ca-88a1-df1e9624f0ed\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-zvfd4" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.605131 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fb2aac2a-eb0d-423b-a574-550cfb36adca-serving-cert\") pod \"authentication-operator-69f744f599-n2kcb\" (UID: \"fb2aac2a-eb0d-423b-a574-550cfb36adca\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-n2kcb" Oct 07 13:09:28 crc kubenswrapper[4677]: 
I1007 13:09:28.605159 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d5ca9be8-efe6-40e8-9f22-75f3e1644622-config\") pod \"route-controller-manager-6576b87f9c-rfpfx\" (UID: \"d5ca9be8-efe6-40e8-9f22-75f3e1644622\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rfpfx" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.605184 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/69136696-f636-4c23-b89a-bfbb2eba3a85-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-5x69q\" (UID: \"69136696-f636-4c23-b89a-bfbb2eba3a85\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5x69q" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.605208 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/a5919bd4-5d4e-4aa1-9b66-36460e7e24f3-encryption-config\") pod \"apiserver-76f77b778f-bkhfv\" (UID: \"a5919bd4-5d4e-4aa1-9b66-36460e7e24f3\") " pod="openshift-apiserver/apiserver-76f77b778f-bkhfv" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.605230 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/69136696-f636-4c23-b89a-bfbb2eba3a85-etcd-client\") pod \"apiserver-7bbb656c7d-5x69q\" (UID: \"69136696-f636-4c23-b89a-bfbb2eba3a85\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5x69q" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.605258 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/bd1c8146-fe00-4a53-a102-17cfc6ef045b-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-gspst\" (UID: \"bd1c8146-fe00-4a53-a102-17cfc6ef045b\") " pod="openshift-authentication/oauth-openshift-558db77b4-gspst" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.605283 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d5ca9be8-efe6-40e8-9f22-75f3e1644622-serving-cert\") pod \"route-controller-manager-6576b87f9c-rfpfx\" (UID: \"d5ca9be8-efe6-40e8-9f22-75f3e1644622\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rfpfx" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.605308 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a5919bd4-5d4e-4aa1-9b66-36460e7e24f3-serving-cert\") pod \"apiserver-76f77b778f-bkhfv\" (UID: \"a5919bd4-5d4e-4aa1-9b66-36460e7e24f3\") " pod="openshift-apiserver/apiserver-76f77b778f-bkhfv" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.605331 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fb2aac2a-eb0d-423b-a574-550cfb36adca-config\") pod \"authentication-operator-69f744f599-n2kcb\" (UID: \"fb2aac2a-eb0d-423b-a574-550cfb36adca\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-n2kcb" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.605352 4677 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fb2aac2a-eb0d-423b-a574-550cfb36adca-service-ca-bundle\") pod \"authentication-operator-69f744f599-n2kcb\" (UID: \"fb2aac2a-eb0d-423b-a574-550cfb36adca\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-n2kcb" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.605378 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/a5919bd4-5d4e-4aa1-9b66-36460e7e24f3-image-import-ca\") pod \"apiserver-76f77b778f-bkhfv\" (UID: \"a5919bd4-5d4e-4aa1-9b66-36460e7e24f3\") " pod="openshift-apiserver/apiserver-76f77b778f-bkhfv" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.605420 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fb2aac2a-eb0d-423b-a574-550cfb36adca-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-n2kcb\" (UID: \"fb2aac2a-eb0d-423b-a574-550cfb36adca\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-n2kcb" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.605461 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-27ch5\" (UniqueName: \"kubernetes.io/projected/a5919bd4-5d4e-4aa1-9b66-36460e7e24f3-kube-api-access-27ch5\") pod \"apiserver-76f77b778f-bkhfv\" (UID: \"a5919bd4-5d4e-4aa1-9b66-36460e7e24f3\") " pod="openshift-apiserver/apiserver-76f77b778f-bkhfv" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.605500 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/807933d0-9a58-4191-9bde-74a00551f72e-config\") pod \"controller-manager-879f6c89f-qk4fp\" (UID: \"807933d0-9a58-4191-9bde-74a00551f72e\") " pod="openshift-controller-manager/controller-manager-879f6c89f-qk4fp" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.605526 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/69136696-f636-4c23-b89a-bfbb2eba3a85-audit-policies\") pod \"apiserver-7bbb656c7d-5x69q\" (UID: \"69136696-f636-4c23-b89a-bfbb2eba3a85\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5x69q" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.605548 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/807933d0-9a58-4191-9bde-74a00551f72e-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-qk4fp\" (UID: \"807933d0-9a58-4191-9bde-74a00551f72e\") " pod="openshift-controller-manager/controller-manager-879f6c89f-qk4fp" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.605573 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/bd1c8146-fe00-4a53-a102-17cfc6ef045b-audit-dir\") pod \"oauth-openshift-558db77b4-gspst\" (UID: \"bd1c8146-fe00-4a53-a102-17cfc6ef045b\") " pod="openshift-authentication/oauth-openshift-558db77b4-gspst" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.605596 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/bd1c8146-fe00-4a53-a102-17cfc6ef045b-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-gspst\" (UID: \"bd1c8146-fe00-4a53-a102-17cfc6ef045b\") " pod="openshift-authentication/oauth-openshift-558db77b4-gspst" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.605620 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jd2jw\" (UniqueName: \"kubernetes.io/projected/fb2aac2a-eb0d-423b-a574-550cfb36adca-kube-api-access-jd2jw\") pod \"authentication-operator-69f744f599-n2kcb\" (UID: \"fb2aac2a-eb0d-423b-a574-550cfb36adca\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-n2kcb" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.605643 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/d96fbf2b-aad8-46ca-88a1-df1e9624f0ed-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-zvfd4\" (UID: \"d96fbf2b-aad8-46ca-88a1-df1e9624f0ed\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-zvfd4" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.605666 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nd4zw\" (UniqueName: \"kubernetes.io/projected/d5ca9be8-efe6-40e8-9f22-75f3e1644622-kube-api-access-nd4zw\") pod \"route-controller-manager-6576b87f9c-rfpfx\" (UID: \"d5ca9be8-efe6-40e8-9f22-75f3e1644622\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rfpfx" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.605691 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/bd1c8146-fe00-4a53-a102-17cfc6ef045b-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-gspst\" (UID: \"bd1c8146-fe00-4a53-a102-17cfc6ef045b\") " pod="openshift-authentication/oauth-openshift-558db77b4-gspst" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.605713 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/807933d0-9a58-4191-9bde-74a00551f72e-serving-cert\") pod \"controller-manager-879f6c89f-qk4fp\" (UID: \"807933d0-9a58-4191-9bde-74a00551f72e\") " pod="openshift-controller-manager/controller-manager-879f6c89f-qk4fp" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.605747 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/bd1c8146-fe00-4a53-a102-17cfc6ef045b-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-gspst\" (UID: \"bd1c8146-fe00-4a53-a102-17cfc6ef045b\") " pod="openshift-authentication/oauth-openshift-558db77b4-gspst" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.605773 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/bd1c8146-fe00-4a53-a102-17cfc6ef045b-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-gspst\" (UID: \"bd1c8146-fe00-4a53-a102-17cfc6ef045b\") " pod="openshift-authentication/oauth-openshift-558db77b4-gspst" Oct 07 13:09:28 crc 
kubenswrapper[4677]: I1007 13:09:28.605797 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-svvr2\" (UniqueName: \"kubernetes.io/projected/5172e7b5-3ef1-4f51-8874-8d4ac858284b-kube-api-access-svvr2\") pod \"machine-api-operator-5694c8668f-b5qm4\" (UID: \"5172e7b5-3ef1-4f51-8874-8d4ac858284b\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-b5qm4" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.605822 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d5xbw\" (UniqueName: \"kubernetes.io/projected/bd1c8146-fe00-4a53-a102-17cfc6ef045b-kube-api-access-d5xbw\") pod \"oauth-openshift-558db77b4-gspst\" (UID: \"bd1c8146-fe00-4a53-a102-17cfc6ef045b\") " pod="openshift-authentication/oauth-openshift-558db77b4-gspst" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.605842 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h9vpd\" (UniqueName: \"kubernetes.io/projected/69136696-f636-4c23-b89a-bfbb2eba3a85-kube-api-access-h9vpd\") pod \"apiserver-7bbb656c7d-5x69q\" (UID: \"69136696-f636-4c23-b89a-bfbb2eba3a85\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5x69q" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.605867 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/bd1c8146-fe00-4a53-a102-17cfc6ef045b-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-gspst\" (UID: \"bd1c8146-fe00-4a53-a102-17cfc6ef045b\") " pod="openshift-authentication/oauth-openshift-558db77b4-gspst" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.605890 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a5919bd4-5d4e-4aa1-9b66-36460e7e24f3-trusted-ca-bundle\") pod \"apiserver-76f77b778f-bkhfv\" (UID: \"a5919bd4-5d4e-4aa1-9b66-36460e7e24f3\") " pod="openshift-apiserver/apiserver-76f77b778f-bkhfv" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.605913 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zwbdg\" (UniqueName: \"kubernetes.io/projected/807933d0-9a58-4191-9bde-74a00551f72e-kube-api-access-zwbdg\") pod \"controller-manager-879f6c89f-qk4fp\" (UID: \"807933d0-9a58-4191-9bde-74a00551f72e\") " pod="openshift-controller-manager/controller-manager-879f6c89f-qk4fp" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.605938 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/bd1c8146-fe00-4a53-a102-17cfc6ef045b-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-gspst\" (UID: \"bd1c8146-fe00-4a53-a102-17cfc6ef045b\") " pod="openshift-authentication/oauth-openshift-558db77b4-gspst" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.605962 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5172e7b5-3ef1-4f51-8874-8d4ac858284b-config\") pod \"machine-api-operator-5694c8668f-b5qm4\" (UID: \"5172e7b5-3ef1-4f51-8874-8d4ac858284b\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-b5qm4" Oct 07 13:09:28 crc 
kubenswrapper[4677]: I1007 13:09:28.614267 4677 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.614609 4677 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.614758 4677 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.614835 4677 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.614982 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e5784d1c-d283-408d-a435-66b7cda6ac32-config\") pod \"openshift-apiserver-operator-796bbdcf4f-bv6vv\" (UID: \"e5784d1c-d283-408d-a435-66b7cda6ac32\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bv6vv" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.615027 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/69136696-f636-4c23-b89a-bfbb2eba3a85-serving-cert\") pod \"apiserver-7bbb656c7d-5x69q\" (UID: \"69136696-f636-4c23-b89a-bfbb2eba3a85\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5x69q" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.615052 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e5784d1c-d283-408d-a435-66b7cda6ac32-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-bv6vv\" (UID: \"e5784d1c-d283-408d-a435-66b7cda6ac32\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bv6vv" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.615069 4677 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.615084 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/bd1c8146-fe00-4a53-a102-17cfc6ef045b-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-gspst\" (UID: \"bd1c8146-fe00-4a53-a102-17cfc6ef045b\") " pod="openshift-authentication/oauth-openshift-558db77b4-gspst" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.615110 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bd1c8146-fe00-4a53-a102-17cfc6ef045b-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-gspst\" (UID: \"bd1c8146-fe00-4a53-a102-17cfc6ef045b\") " pod="openshift-authentication/oauth-openshift-558db77b4-gspst" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.615130 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/a5919bd4-5d4e-4aa1-9b66-36460e7e24f3-etcd-serving-ca\") pod \"apiserver-76f77b778f-bkhfv\" (UID: \"a5919bd4-5d4e-4aa1-9b66-36460e7e24f3\") " 
pod="openshift-apiserver/apiserver-76f77b778f-bkhfv" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.615149 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/807933d0-9a58-4191-9bde-74a00551f72e-client-ca\") pod \"controller-manager-879f6c89f-qk4fp\" (UID: \"807933d0-9a58-4191-9bde-74a00551f72e\") " pod="openshift-controller-manager/controller-manager-879f6c89f-qk4fp" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.615173 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/bd1c8146-fe00-4a53-a102-17cfc6ef045b-audit-policies\") pod \"oauth-openshift-558db77b4-gspst\" (UID: \"bd1c8146-fe00-4a53-a102-17cfc6ef045b\") " pod="openshift-authentication/oauth-openshift-558db77b4-gspst" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.615192 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/a5919bd4-5d4e-4aa1-9b66-36460e7e24f3-audit\") pod \"apiserver-76f77b778f-bkhfv\" (UID: \"a5919bd4-5d4e-4aa1-9b66-36460e7e24f3\") " pod="openshift-apiserver/apiserver-76f77b778f-bkhfv" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.615212 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d5ca9be8-efe6-40e8-9f22-75f3e1644622-client-ca\") pod \"route-controller-manager-6576b87f9c-rfpfx\" (UID: \"d5ca9be8-efe6-40e8-9f22-75f3e1644622\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rfpfx" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.615232 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/a5919bd4-5d4e-4aa1-9b66-36460e7e24f3-node-pullsecrets\") pod \"apiserver-76f77b778f-bkhfv\" (UID: \"a5919bd4-5d4e-4aa1-9b66-36460e7e24f3\") " pod="openshift-apiserver/apiserver-76f77b778f-bkhfv" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.615251 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a5919bd4-5d4e-4aa1-9b66-36460e7e24f3-audit-dir\") pod \"apiserver-76f77b778f-bkhfv\" (UID: \"a5919bd4-5d4e-4aa1-9b66-36460e7e24f3\") " pod="openshift-apiserver/apiserver-76f77b778f-bkhfv" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.615310 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/bd1c8146-fe00-4a53-a102-17cfc6ef045b-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-gspst\" (UID: \"bd1c8146-fe00-4a53-a102-17cfc6ef045b\") " pod="openshift-authentication/oauth-openshift-558db77b4-gspst" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.615331 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/69136696-f636-4c23-b89a-bfbb2eba3a85-encryption-config\") pod \"apiserver-7bbb656c7d-5x69q\" (UID: \"69136696-f636-4c23-b89a-bfbb2eba3a85\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5x69q" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.615454 4677 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.615476 4677 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.615473 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/bd1c8146-fe00-4a53-a102-17cfc6ef045b-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-gspst\" (UID: \"bd1c8146-fe00-4a53-a102-17cfc6ef045b\") " pod="openshift-authentication/oauth-openshift-558db77b4-gspst" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.615613 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/69136696-f636-4c23-b89a-bfbb2eba3a85-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-5x69q\" (UID: \"69136696-f636-4c23-b89a-bfbb2eba3a85\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5x69q" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.615665 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/5172e7b5-3ef1-4f51-8874-8d4ac858284b-images\") pod \"machine-api-operator-5694c8668f-b5qm4\" (UID: \"5172e7b5-3ef1-4f51-8874-8d4ac858284b\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-b5qm4" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.615691 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/5172e7b5-3ef1-4f51-8874-8d4ac858284b-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-b5qm4\" (UID: \"5172e7b5-3ef1-4f51-8874-8d4ac858284b\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-b5qm4" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.615738 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/69136696-f636-4c23-b89a-bfbb2eba3a85-audit-dir\") pod \"apiserver-7bbb656c7d-5x69q\" (UID: \"69136696-f636-4c23-b89a-bfbb2eba3a85\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5x69q" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.625347 4677 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.625591 4677 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.629475 4677 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.630145 4677 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.630225 4677 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.631685 4677 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.634137 4677 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.634250 4677 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.634326 4677 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.634423 4677 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.634530 4677 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.634639 4677 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.634944 4677 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.635080 4677 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.635213 4677 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.635390 4677 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.635405 4677 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.635547 4677 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.635648 4677 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.635743 4677 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.635768 4677 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.635788 4677 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-nhjp9"] Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.635552 4677 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.635777 4677 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Oct 07 13:09:28 crc 
kubenswrapper[4677]: I1007 13:09:28.635978 4677 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.636390 4677 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-mwgkm"] Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.636764 4677 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wt49f"] Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.636848 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-nhjp9" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.637120 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wt49f" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.637179 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-mwgkm" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.637498 4677 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-psk9k"] Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.638100 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-psk9k" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.638245 4677 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-76928"] Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.638718 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-76928" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.640473 4677 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-b29lb"] Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.640834 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-b29lb" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.649492 4677 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-hn2p2"] Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.650057 4677 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-bldn4"] Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.650485 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-bldn4" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.651037 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-hn2p2" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.660551 4677 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-rkv4v"] Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.661265 4677 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-rkv4v" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.666786 4677 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-xcssh"] Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.670807 4677 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.671036 4677 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.671119 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-xcssh" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.671591 4677 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.671669 4677 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.671918 4677 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.671943 4677 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.672074 4677 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.672277 4677 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.672504 4677 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.672828 4677 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.672831 4677 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.673094 4677 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.671594 4677 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.673951 4677 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.675370 4677 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.679626 4677 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Oct 07 13:09:28 crc kubenswrapper[4677]: 
I1007 13:09:28.697542 4677 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.697755 4677 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.697845 4677 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.698523 4677 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.698788 4677 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.698913 4677 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.699006 4677 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.699099 4677 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.699365 4677 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.699574 4677 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-9mpgg"] Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.699832 4677 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.700002 4677 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.700092 4677 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.700170 4677 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.700194 4677 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.700280 4677 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.700378 4677 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.700281 4677 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.700545 4677 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-9mpgg" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.701002 4677 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.702456 4677 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.705339 4677 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.708808 4677 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gmgcc"] Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.710187 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gmgcc" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.718897 4677 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.719219 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/69136696-f636-4c23-b89a-bfbb2eba3a85-serving-cert\") pod \"apiserver-7bbb656c7d-5x69q\" (UID: \"69136696-f636-4c23-b89a-bfbb2eba3a85\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5x69q" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.719240 4677 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.719249 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e5784d1c-d283-408d-a435-66b7cda6ac32-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-bv6vv\" (UID: \"e5784d1c-d283-408d-a435-66b7cda6ac32\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bv6vv" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.719517 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-px67s\" (UniqueName: \"kubernetes.io/projected/579c8074-5c53-4e1f-a620-f04cbccf63aa-kube-api-access-px67s\") pod \"console-operator-58897d9998-hn2p2\" (UID: \"579c8074-5c53-4e1f-a620-f04cbccf63aa\") " pod="openshift-console-operator/console-operator-58897d9998-hn2p2" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.719563 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/bd1c8146-fe00-4a53-a102-17cfc6ef045b-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-gspst\" (UID: \"bd1c8146-fe00-4a53-a102-17cfc6ef045b\") " pod="openshift-authentication/oauth-openshift-558db77b4-gspst" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.719608 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bd1c8146-fe00-4a53-a102-17cfc6ef045b-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-gspst\" (UID: \"bd1c8146-fe00-4a53-a102-17cfc6ef045b\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-gspst" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.719630 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/a5919bd4-5d4e-4aa1-9b66-36460e7e24f3-etcd-serving-ca\") pod \"apiserver-76f77b778f-bkhfv\" (UID: \"a5919bd4-5d4e-4aa1-9b66-36460e7e24f3\") " pod="openshift-apiserver/apiserver-76f77b778f-bkhfv" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.719648 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/807933d0-9a58-4191-9bde-74a00551f72e-client-ca\") pod \"controller-manager-879f6c89f-qk4fp\" (UID: \"807933d0-9a58-4191-9bde-74a00551f72e\") " pod="openshift-controller-manager/controller-manager-879f6c89f-qk4fp" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.719668 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6edf078f-96f5-46e0-be5c-012e4799f320-config\") pod \"machine-approver-56656f9798-psk9k\" (UID: \"6edf078f-96f5-46e0-be5c-012e4799f320\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-psk9k" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.719687 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/57faa01e-5137-4dde-9102-e80bb5891cc5-etcd-ca\") pod \"etcd-operator-b45778765-rkv4v\" (UID: \"57faa01e-5137-4dde-9102-e80bb5891cc5\") " pod="openshift-etcd-operator/etcd-operator-b45778765-rkv4v" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.719706 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/bd1c8146-fe00-4a53-a102-17cfc6ef045b-audit-policies\") pod \"oauth-openshift-558db77b4-gspst\" (UID: \"bd1c8146-fe00-4a53-a102-17cfc6ef045b\") " pod="openshift-authentication/oauth-openshift-558db77b4-gspst" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.719722 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/a5919bd4-5d4e-4aa1-9b66-36460e7e24f3-audit\") pod \"apiserver-76f77b778f-bkhfv\" (UID: \"a5919bd4-5d4e-4aa1-9b66-36460e7e24f3\") " pod="openshift-apiserver/apiserver-76f77b778f-bkhfv" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.719742 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d5ca9be8-efe6-40e8-9f22-75f3e1644622-client-ca\") pod \"route-controller-manager-6576b87f9c-rfpfx\" (UID: \"d5ca9be8-efe6-40e8-9f22-75f3e1644622\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rfpfx" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.719763 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5bxz2\" (UniqueName: \"kubernetes.io/projected/c3c43bec-10f0-4f24-b9f4-6d0cfb694cb5-kube-api-access-5bxz2\") pod \"cluster-image-registry-operator-dc59b4c8b-b29lb\" (UID: \"c3c43bec-10f0-4f24-b9f4-6d0cfb694cb5\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-b29lb" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.719784 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/364ed7ee-3c5a-4d7f-ba97-ddd52483de83-trusted-ca-bundle\") pod \"console-f9d7485db-76928\" (UID: \"364ed7ee-3c5a-4d7f-ba97-ddd52483de83\") " pod="openshift-console/console-f9d7485db-76928" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.719799 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/364ed7ee-3c5a-4d7f-ba97-ddd52483de83-oauth-serving-cert\") pod \"console-f9d7485db-76928\" (UID: \"364ed7ee-3c5a-4d7f-ba97-ddd52483de83\") " pod="openshift-console/console-f9d7485db-76928" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.719822 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/a5919bd4-5d4e-4aa1-9b66-36460e7e24f3-node-pullsecrets\") pod \"apiserver-76f77b778f-bkhfv\" (UID: \"a5919bd4-5d4e-4aa1-9b66-36460e7e24f3\") " pod="openshift-apiserver/apiserver-76f77b778f-bkhfv" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.719840 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a5919bd4-5d4e-4aa1-9b66-36460e7e24f3-audit-dir\") pod \"apiserver-76f77b778f-bkhfv\" (UID: \"a5919bd4-5d4e-4aa1-9b66-36460e7e24f3\") " pod="openshift-apiserver/apiserver-76f77b778f-bkhfv" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.719856 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/579c8074-5c53-4e1f-a620-f04cbccf63aa-config\") pod \"console-operator-58897d9998-hn2p2\" (UID: \"579c8074-5c53-4e1f-a620-f04cbccf63aa\") " pod="openshift-console-operator/console-operator-58897d9998-hn2p2" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.719871 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/57faa01e-5137-4dde-9102-e80bb5891cc5-etcd-client\") pod \"etcd-operator-b45778765-rkv4v\" (UID: \"57faa01e-5137-4dde-9102-e80bb5891cc5\") " pod="openshift-etcd-operator/etcd-operator-b45778765-rkv4v" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.719888 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lb7t2\" (UniqueName: \"kubernetes.io/projected/364ed7ee-3c5a-4d7f-ba97-ddd52483de83-kube-api-access-lb7t2\") pod \"console-f9d7485db-76928\" (UID: \"364ed7ee-3c5a-4d7f-ba97-ddd52483de83\") " pod="openshift-console/console-f9d7485db-76928" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.719915 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/364ed7ee-3c5a-4d7f-ba97-ddd52483de83-service-ca\") pod \"console-f9d7485db-76928\" (UID: \"364ed7ee-3c5a-4d7f-ba97-ddd52483de83\") " pod="openshift-console/console-f9d7485db-76928" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.719937 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/bd1c8146-fe00-4a53-a102-17cfc6ef045b-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-gspst\" (UID: \"bd1c8146-fe00-4a53-a102-17cfc6ef045b\") " pod="openshift-authentication/oauth-openshift-558db77b4-gspst" Oct 07 
13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.719955 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/69136696-f636-4c23-b89a-bfbb2eba3a85-encryption-config\") pod \"apiserver-7bbb656c7d-5x69q\" (UID: \"69136696-f636-4c23-b89a-bfbb2eba3a85\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5x69q" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.719974 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5pvkj\" (UniqueName: \"kubernetes.io/projected/67be2c46-f396-48d1-ba5e-d21f8362a4dc-kube-api-access-5pvkj\") pod \"downloads-7954f5f757-mwgkm\" (UID: \"67be2c46-f396-48d1-ba5e-d21f8362a4dc\") " pod="openshift-console/downloads-7954f5f757-mwgkm" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.719990 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/bd1c8146-fe00-4a53-a102-17cfc6ef045b-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-gspst\" (UID: \"bd1c8146-fe00-4a53-a102-17cfc6ef045b\") " pod="openshift-authentication/oauth-openshift-558db77b4-gspst" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.720012 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/69136696-f636-4c23-b89a-bfbb2eba3a85-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-5x69q\" (UID: \"69136696-f636-4c23-b89a-bfbb2eba3a85\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5x69q" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.720034 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/5172e7b5-3ef1-4f51-8874-8d4ac858284b-images\") pod \"machine-api-operator-5694c8668f-b5qm4\" (UID: \"5172e7b5-3ef1-4f51-8874-8d4ac858284b\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-b5qm4" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.720056 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/5172e7b5-3ef1-4f51-8874-8d4ac858284b-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-b5qm4\" (UID: \"5172e7b5-3ef1-4f51-8874-8d4ac858284b\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-b5qm4" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.720077 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/07dd67bc-303c-4a2a-bc68-e430cc5e63c2-metrics-tls\") pod \"dns-operator-744455d44c-xcssh\" (UID: \"07dd67bc-303c-4a2a-bc68-e430cc5e63c2\") " pod="openshift-dns-operator/dns-operator-744455d44c-xcssh" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.720102 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/69136696-f636-4c23-b89a-bfbb2eba3a85-audit-dir\") pod \"apiserver-7bbb656c7d-5x69q\" (UID: \"69136696-f636-4c23-b89a-bfbb2eba3a85\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5x69q" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.720126 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8nx9t\" (UniqueName: 
\"kubernetes.io/projected/57faa01e-5137-4dde-9102-e80bb5891cc5-kube-api-access-8nx9t\") pod \"etcd-operator-b45778765-rkv4v\" (UID: \"57faa01e-5137-4dde-9102-e80bb5891cc5\") " pod="openshift-etcd-operator/etcd-operator-b45778765-rkv4v" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.720158 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a5919bd4-5d4e-4aa1-9b66-36460e7e24f3-config\") pod \"apiserver-76f77b778f-bkhfv\" (UID: \"a5919bd4-5d4e-4aa1-9b66-36460e7e24f3\") " pod="openshift-apiserver/apiserver-76f77b778f-bkhfv" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.720174 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/a5919bd4-5d4e-4aa1-9b66-36460e7e24f3-etcd-client\") pod \"apiserver-76f77b778f-bkhfv\" (UID: \"a5919bd4-5d4e-4aa1-9b66-36460e7e24f3\") " pod="openshift-apiserver/apiserver-76f77b778f-bkhfv" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.720193 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dwmwr\" (UniqueName: \"kubernetes.io/projected/e5784d1c-d283-408d-a435-66b7cda6ac32-kube-api-access-dwmwr\") pod \"openshift-apiserver-operator-796bbdcf4f-bv6vv\" (UID: \"e5784d1c-d283-408d-a435-66b7cda6ac32\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bv6vv" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.720220 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nd6kp\" (UniqueName: \"kubernetes.io/projected/d96fbf2b-aad8-46ca-88a1-df1e9624f0ed-kube-api-access-nd6kp\") pod \"cluster-samples-operator-665b6dd947-zvfd4\" (UID: \"d96fbf2b-aad8-46ca-88a1-df1e9624f0ed\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-zvfd4" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.720237 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/579c8074-5c53-4e1f-a620-f04cbccf63aa-trusted-ca\") pod \"console-operator-58897d9998-hn2p2\" (UID: \"579c8074-5c53-4e1f-a620-f04cbccf63aa\") " pod="openshift-console-operator/console-operator-58897d9998-hn2p2" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.720253 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4vvl7\" (UniqueName: \"kubernetes.io/projected/7fc2830b-0f4d-4b3b-89bc-6e589839077d-kube-api-access-4vvl7\") pod \"openshift-controller-manager-operator-756b6f6bc6-wt49f\" (UID: \"7fc2830b-0f4d-4b3b-89bc-6e589839077d\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wt49f" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.720270 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/364ed7ee-3c5a-4d7f-ba97-ddd52483de83-console-oauth-config\") pod \"console-f9d7485db-76928\" (UID: \"364ed7ee-3c5a-4d7f-ba97-ddd52483de83\") " pod="openshift-console/console-f9d7485db-76928" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.720289 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fb2aac2a-eb0d-423b-a574-550cfb36adca-serving-cert\") pod \"authentication-operator-69f744f599-n2kcb\" 
(UID: \"fb2aac2a-eb0d-423b-a574-550cfb36adca\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-n2kcb" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.720305 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d5ca9be8-efe6-40e8-9f22-75f3e1644622-config\") pod \"route-controller-manager-6576b87f9c-rfpfx\" (UID: \"d5ca9be8-efe6-40e8-9f22-75f3e1644622\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rfpfx" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.720321 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/6edf078f-96f5-46e0-be5c-012e4799f320-machine-approver-tls\") pod \"machine-approver-56656f9798-psk9k\" (UID: \"6edf078f-96f5-46e0-be5c-012e4799f320\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-psk9k" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.720337 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/57faa01e-5137-4dde-9102-e80bb5891cc5-config\") pod \"etcd-operator-b45778765-rkv4v\" (UID: \"57faa01e-5137-4dde-9102-e80bb5891cc5\") " pod="openshift-etcd-operator/etcd-operator-b45778765-rkv4v" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.720357 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/69136696-f636-4c23-b89a-bfbb2eba3a85-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-5x69q\" (UID: \"69136696-f636-4c23-b89a-bfbb2eba3a85\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5x69q" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.720380 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/a5919bd4-5d4e-4aa1-9b66-36460e7e24f3-encryption-config\") pod \"apiserver-76f77b778f-bkhfv\" (UID: \"a5919bd4-5d4e-4aa1-9b66-36460e7e24f3\") " pod="openshift-apiserver/apiserver-76f77b778f-bkhfv" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.720399 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/69136696-f636-4c23-b89a-bfbb2eba3a85-etcd-client\") pod \"apiserver-7bbb656c7d-5x69q\" (UID: \"69136696-f636-4c23-b89a-bfbb2eba3a85\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5x69q" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.720417 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7fc2830b-0f4d-4b3b-89bc-6e589839077d-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-wt49f\" (UID: \"7fc2830b-0f4d-4b3b-89bc-6e589839077d\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wt49f" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.720457 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/bd1c8146-fe00-4a53-a102-17cfc6ef045b-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-gspst\" (UID: \"bd1c8146-fe00-4a53-a102-17cfc6ef045b\") " pod="openshift-authentication/oauth-openshift-558db77b4-gspst" Oct 07 13:09:28 crc 
kubenswrapper[4677]: I1007 13:09:28.720478 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d5ca9be8-efe6-40e8-9f22-75f3e1644622-serving-cert\") pod \"route-controller-manager-6576b87f9c-rfpfx\" (UID: \"d5ca9be8-efe6-40e8-9f22-75f3e1644622\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rfpfx" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.720499 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a5919bd4-5d4e-4aa1-9b66-36460e7e24f3-serving-cert\") pod \"apiserver-76f77b778f-bkhfv\" (UID: \"a5919bd4-5d4e-4aa1-9b66-36460e7e24f3\") " pod="openshift-apiserver/apiserver-76f77b778f-bkhfv" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.720515 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fb2aac2a-eb0d-423b-a574-550cfb36adca-config\") pod \"authentication-operator-69f744f599-n2kcb\" (UID: \"fb2aac2a-eb0d-423b-a574-550cfb36adca\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-n2kcb" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.720530 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fb2aac2a-eb0d-423b-a574-550cfb36adca-service-ca-bundle\") pod \"authentication-operator-69f744f599-n2kcb\" (UID: \"fb2aac2a-eb0d-423b-a574-550cfb36adca\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-n2kcb" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.720552 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/a5919bd4-5d4e-4aa1-9b66-36460e7e24f3-image-import-ca\") pod \"apiserver-76f77b778f-bkhfv\" (UID: \"a5919bd4-5d4e-4aa1-9b66-36460e7e24f3\") " pod="openshift-apiserver/apiserver-76f77b778f-bkhfv" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.720567 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fb2aac2a-eb0d-423b-a574-550cfb36adca-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-n2kcb\" (UID: \"fb2aac2a-eb0d-423b-a574-550cfb36adca\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-n2kcb" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.720585 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-27ch5\" (UniqueName: \"kubernetes.io/projected/a5919bd4-5d4e-4aa1-9b66-36460e7e24f3-kube-api-access-27ch5\") pod \"apiserver-76f77b778f-bkhfv\" (UID: \"a5919bd4-5d4e-4aa1-9b66-36460e7e24f3\") " pod="openshift-apiserver/apiserver-76f77b778f-bkhfv" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.720602 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/c3c43bec-10f0-4f24-b9f4-6d0cfb694cb5-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-b29lb\" (UID: \"c3c43bec-10f0-4f24-b9f4-6d0cfb694cb5\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-b29lb" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.720619 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c3c43bec-10f0-4f24-b9f4-6d0cfb694cb5-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-b29lb\" (UID: \"c3c43bec-10f0-4f24-b9f4-6d0cfb694cb5\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-b29lb" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.720645 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q6xr8\" (UniqueName: \"kubernetes.io/projected/07dd67bc-303c-4a2a-bc68-e430cc5e63c2-kube-api-access-q6xr8\") pod \"dns-operator-744455d44c-xcssh\" (UID: \"07dd67bc-303c-4a2a-bc68-e430cc5e63c2\") " pod="openshift-dns-operator/dns-operator-744455d44c-xcssh" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.720663 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/807933d0-9a58-4191-9bde-74a00551f72e-config\") pod \"controller-manager-879f6c89f-qk4fp\" (UID: \"807933d0-9a58-4191-9bde-74a00551f72e\") " pod="openshift-controller-manager/controller-manager-879f6c89f-qk4fp" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.720679 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qrdr6\" (UniqueName: \"kubernetes.io/projected/6edf078f-96f5-46e0-be5c-012e4799f320-kube-api-access-qrdr6\") pod \"machine-approver-56656f9798-psk9k\" (UID: \"6edf078f-96f5-46e0-be5c-012e4799f320\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-psk9k" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.720700 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/57faa01e-5137-4dde-9102-e80bb5891cc5-serving-cert\") pod \"etcd-operator-b45778765-rkv4v\" (UID: \"57faa01e-5137-4dde-9102-e80bb5891cc5\") " pod="openshift-etcd-operator/etcd-operator-b45778765-rkv4v" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.720719 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/69136696-f636-4c23-b89a-bfbb2eba3a85-audit-policies\") pod \"apiserver-7bbb656c7d-5x69q\" (UID: \"69136696-f636-4c23-b89a-bfbb2eba3a85\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5x69q" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.720735 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/807933d0-9a58-4191-9bde-74a00551f72e-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-qk4fp\" (UID: \"807933d0-9a58-4191-9bde-74a00551f72e\") " pod="openshift-controller-manager/controller-manager-879f6c89f-qk4fp" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.720750 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/57faa01e-5137-4dde-9102-e80bb5891cc5-etcd-service-ca\") pod \"etcd-operator-b45778765-rkv4v\" (UID: \"57faa01e-5137-4dde-9102-e80bb5891cc5\") " pod="openshift-etcd-operator/etcd-operator-b45778765-rkv4v" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.720779 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/bd1c8146-fe00-4a53-a102-17cfc6ef045b-audit-dir\") pod \"oauth-openshift-558db77b4-gspst\" 
(UID: \"bd1c8146-fe00-4a53-a102-17cfc6ef045b\") " pod="openshift-authentication/oauth-openshift-558db77b4-gspst" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.720796 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/bd1c8146-fe00-4a53-a102-17cfc6ef045b-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-gspst\" (UID: \"bd1c8146-fe00-4a53-a102-17cfc6ef045b\") " pod="openshift-authentication/oauth-openshift-558db77b4-gspst" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.720812 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jd2jw\" (UniqueName: \"kubernetes.io/projected/fb2aac2a-eb0d-423b-a574-550cfb36adca-kube-api-access-jd2jw\") pod \"authentication-operator-69f744f599-n2kcb\" (UID: \"fb2aac2a-eb0d-423b-a574-550cfb36adca\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-n2kcb" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.720828 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/d96fbf2b-aad8-46ca-88a1-df1e9624f0ed-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-zvfd4\" (UID: \"d96fbf2b-aad8-46ca-88a1-df1e9624f0ed\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-zvfd4" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.720848 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nd4zw\" (UniqueName: \"kubernetes.io/projected/d5ca9be8-efe6-40e8-9f22-75f3e1644622-kube-api-access-nd4zw\") pod \"route-controller-manager-6576b87f9c-rfpfx\" (UID: \"d5ca9be8-efe6-40e8-9f22-75f3e1644622\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rfpfx" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.720863 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/579c8074-5c53-4e1f-a620-f04cbccf63aa-serving-cert\") pod \"console-operator-58897d9998-hn2p2\" (UID: \"579c8074-5c53-4e1f-a620-f04cbccf63aa\") " pod="openshift-console-operator/console-operator-58897d9998-hn2p2" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.720878 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/364ed7ee-3c5a-4d7f-ba97-ddd52483de83-console-config\") pod \"console-f9d7485db-76928\" (UID: \"364ed7ee-3c5a-4d7f-ba97-ddd52483de83\") " pod="openshift-console/console-f9d7485db-76928" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.720897 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/bd1c8146-fe00-4a53-a102-17cfc6ef045b-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-gspst\" (UID: \"bd1c8146-fe00-4a53-a102-17cfc6ef045b\") " pod="openshift-authentication/oauth-openshift-558db77b4-gspst" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.720912 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/807933d0-9a58-4191-9bde-74a00551f72e-serving-cert\") pod \"controller-manager-879f6c89f-qk4fp\" (UID: \"807933d0-9a58-4191-9bde-74a00551f72e\") " 
pod="openshift-controller-manager/controller-manager-879f6c89f-qk4fp" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.720929 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/6edf078f-96f5-46e0-be5c-012e4799f320-auth-proxy-config\") pod \"machine-approver-56656f9798-psk9k\" (UID: \"6edf078f-96f5-46e0-be5c-012e4799f320\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-psk9k" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.720954 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/bd1c8146-fe00-4a53-a102-17cfc6ef045b-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-gspst\" (UID: \"bd1c8146-fe00-4a53-a102-17cfc6ef045b\") " pod="openshift-authentication/oauth-openshift-558db77b4-gspst" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.720970 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/bd1c8146-fe00-4a53-a102-17cfc6ef045b-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-gspst\" (UID: \"bd1c8146-fe00-4a53-a102-17cfc6ef045b\") " pod="openshift-authentication/oauth-openshift-558db77b4-gspst" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.720994 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-svvr2\" (UniqueName: \"kubernetes.io/projected/5172e7b5-3ef1-4f51-8874-8d4ac858284b-kube-api-access-svvr2\") pod \"machine-api-operator-5694c8668f-b5qm4\" (UID: \"5172e7b5-3ef1-4f51-8874-8d4ac858284b\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-b5qm4" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.721017 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7fc2830b-0f4d-4b3b-89bc-6e589839077d-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-wt49f\" (UID: \"7fc2830b-0f4d-4b3b-89bc-6e589839077d\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wt49f" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.721043 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d5xbw\" (UniqueName: \"kubernetes.io/projected/bd1c8146-fe00-4a53-a102-17cfc6ef045b-kube-api-access-d5xbw\") pod \"oauth-openshift-558db77b4-gspst\" (UID: \"bd1c8146-fe00-4a53-a102-17cfc6ef045b\") " pod="openshift-authentication/oauth-openshift-558db77b4-gspst" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.721066 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h9vpd\" (UniqueName: \"kubernetes.io/projected/69136696-f636-4c23-b89a-bfbb2eba3a85-kube-api-access-h9vpd\") pod \"apiserver-7bbb656c7d-5x69q\" (UID: \"69136696-f636-4c23-b89a-bfbb2eba3a85\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5x69q" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.721089 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/bd1c8146-fe00-4a53-a102-17cfc6ef045b-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-gspst\" (UID: \"bd1c8146-fe00-4a53-a102-17cfc6ef045b\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-gspst" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.721107 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a5919bd4-5d4e-4aa1-9b66-36460e7e24f3-trusted-ca-bundle\") pod \"apiserver-76f77b778f-bkhfv\" (UID: \"a5919bd4-5d4e-4aa1-9b66-36460e7e24f3\") " pod="openshift-apiserver/apiserver-76f77b778f-bkhfv" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.721124 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zwbdg\" (UniqueName: \"kubernetes.io/projected/807933d0-9a58-4191-9bde-74a00551f72e-kube-api-access-zwbdg\") pod \"controller-manager-879f6c89f-qk4fp\" (UID: \"807933d0-9a58-4191-9bde-74a00551f72e\") " pod="openshift-controller-manager/controller-manager-879f6c89f-qk4fp" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.721143 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/bd1c8146-fe00-4a53-a102-17cfc6ef045b-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-gspst\" (UID: \"bd1c8146-fe00-4a53-a102-17cfc6ef045b\") " pod="openshift-authentication/oauth-openshift-558db77b4-gspst" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.721158 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/364ed7ee-3c5a-4d7f-ba97-ddd52483de83-console-serving-cert\") pod \"console-f9d7485db-76928\" (UID: \"364ed7ee-3c5a-4d7f-ba97-ddd52483de83\") " pod="openshift-console/console-f9d7485db-76928" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.721177 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5172e7b5-3ef1-4f51-8874-8d4ac858284b-config\") pod \"machine-api-operator-5694c8668f-b5qm4\" (UID: \"5172e7b5-3ef1-4f51-8874-8d4ac858284b\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-b5qm4" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.721193 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e5784d1c-d283-408d-a435-66b7cda6ac32-config\") pod \"openshift-apiserver-operator-796bbdcf4f-bv6vv\" (UID: \"e5784d1c-d283-408d-a435-66b7cda6ac32\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bv6vv" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.721210 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c3c43bec-10f0-4f24-b9f4-6d0cfb694cb5-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-b29lb\" (UID: \"c3c43bec-10f0-4f24-b9f4-6d0cfb694cb5\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-b29lb" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.721394 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bd1c8146-fe00-4a53-a102-17cfc6ef045b-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-gspst\" (UID: \"bd1c8146-fe00-4a53-a102-17cfc6ef045b\") " pod="openshift-authentication/oauth-openshift-558db77b4-gspst" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.721882 
4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/69136696-f636-4c23-b89a-bfbb2eba3a85-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-5x69q\" (UID: \"69136696-f636-4c23-b89a-bfbb2eba3a85\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5x69q" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.721911 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/a5919bd4-5d4e-4aa1-9b66-36460e7e24f3-etcd-serving-ca\") pod \"apiserver-76f77b778f-bkhfv\" (UID: \"a5919bd4-5d4e-4aa1-9b66-36460e7e24f3\") " pod="openshift-apiserver/apiserver-76f77b778f-bkhfv" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.722529 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/807933d0-9a58-4191-9bde-74a00551f72e-client-ca\") pod \"controller-manager-879f6c89f-qk4fp\" (UID: \"807933d0-9a58-4191-9bde-74a00551f72e\") " pod="openshift-controller-manager/controller-manager-879f6c89f-qk4fp" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.722967 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/bd1c8146-fe00-4a53-a102-17cfc6ef045b-audit-policies\") pod \"oauth-openshift-558db77b4-gspst\" (UID: \"bd1c8146-fe00-4a53-a102-17cfc6ef045b\") " pod="openshift-authentication/oauth-openshift-558db77b4-gspst" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.723686 4677 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jh976"] Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.724176 4677 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-c4qbb"] Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.724481 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-b5qm4"] Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.724569 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-c4qbb" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.724798 4677 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jh976" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.726037 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/a5919bd4-5d4e-4aa1-9b66-36460e7e24f3-audit\") pod \"apiserver-76f77b778f-bkhfv\" (UID: \"a5919bd4-5d4e-4aa1-9b66-36460e7e24f3\") " pod="openshift-apiserver/apiserver-76f77b778f-bkhfv" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.726237 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/69136696-f636-4c23-b89a-bfbb2eba3a85-audit-dir\") pod \"apiserver-7bbb656c7d-5x69q\" (UID: \"69136696-f636-4c23-b89a-bfbb2eba3a85\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5x69q" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.726569 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/bd1c8146-fe00-4a53-a102-17cfc6ef045b-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-gspst\" (UID: \"bd1c8146-fe00-4a53-a102-17cfc6ef045b\") " pod="openshift-authentication/oauth-openshift-558db77b4-gspst" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.727219 4677 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-s576r"] Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.727632 4677 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-qgcjl"] Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.728495 4677 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qgcjl" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.731322 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/5172e7b5-3ef1-4f51-8874-8d4ac858284b-images\") pod \"machine-api-operator-5694c8668f-b5qm4\" (UID: \"5172e7b5-3ef1-4f51-8874-8d4ac858284b\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-b5qm4" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.729740 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fb2aac2a-eb0d-423b-a574-550cfb36adca-config\") pod \"authentication-operator-69f744f599-n2kcb\" (UID: \"fb2aac2a-eb0d-423b-a574-550cfb36adca\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-n2kcb" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.730106 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fb2aac2a-eb0d-423b-a574-550cfb36adca-service-ca-bundle\") pod \"authentication-operator-69f744f599-n2kcb\" (UID: \"fb2aac2a-eb0d-423b-a574-550cfb36adca\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-n2kcb" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.730924 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/a5919bd4-5d4e-4aa1-9b66-36460e7e24f3-image-import-ca\") pod \"apiserver-76f77b778f-bkhfv\" (UID: \"a5919bd4-5d4e-4aa1-9b66-36460e7e24f3\") " pod="openshift-apiserver/apiserver-76f77b778f-bkhfv" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.731495 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d5ca9be8-efe6-40e8-9f22-75f3e1644622-client-ca\") pod \"route-controller-manager-6576b87f9c-rfpfx\" (UID: \"d5ca9be8-efe6-40e8-9f22-75f3e1644622\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rfpfx" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.731641 4677 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-s576r" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.729231 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fb2aac2a-eb0d-423b-a574-550cfb36adca-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-n2kcb\" (UID: \"fb2aac2a-eb0d-423b-a574-550cfb36adca\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-n2kcb" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.732315 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/807933d0-9a58-4191-9bde-74a00551f72e-config\") pod \"controller-manager-879f6c89f-qk4fp\" (UID: \"807933d0-9a58-4191-9bde-74a00551f72e\") " pod="openshift-controller-manager/controller-manager-879f6c89f-qk4fp" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.734673 4677 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.734825 4677 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-b9mxw"] Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.735104 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/69136696-f636-4c23-b89a-bfbb2eba3a85-serving-cert\") pod \"apiserver-7bbb656c7d-5x69q\" (UID: \"69136696-f636-4c23-b89a-bfbb2eba3a85\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5x69q" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.735517 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/a5919bd4-5d4e-4aa1-9b66-36460e7e24f3-node-pullsecrets\") pod \"apiserver-76f77b778f-bkhfv\" (UID: \"a5919bd4-5d4e-4aa1-9b66-36460e7e24f3\") " pod="openshift-apiserver/apiserver-76f77b778f-bkhfv" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.735556 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a5919bd4-5d4e-4aa1-9b66-36460e7e24f3-audit-dir\") pod \"apiserver-76f77b778f-bkhfv\" (UID: \"a5919bd4-5d4e-4aa1-9b66-36460e7e24f3\") " pod="openshift-apiserver/apiserver-76f77b778f-bkhfv" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.735815 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/5172e7b5-3ef1-4f51-8874-8d4ac858284b-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-b5qm4\" (UID: \"5172e7b5-3ef1-4f51-8874-8d4ac858284b\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-b5qm4" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.735990 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/bd1c8146-fe00-4a53-a102-17cfc6ef045b-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-gspst\" (UID: \"bd1c8146-fe00-4a53-a102-17cfc6ef045b\") " pod="openshift-authentication/oauth-openshift-558db77b4-gspst" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.737879 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/69136696-f636-4c23-b89a-bfbb2eba3a85-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-5x69q\" (UID: \"69136696-f636-4c23-b89a-bfbb2eba3a85\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5x69q" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.738236 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a5919bd4-5d4e-4aa1-9b66-36460e7e24f3-trusted-ca-bundle\") pod \"apiserver-76f77b778f-bkhfv\" (UID: \"a5919bd4-5d4e-4aa1-9b66-36460e7e24f3\") " pod="openshift-apiserver/apiserver-76f77b778f-bkhfv" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.738862 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d5ca9be8-efe6-40e8-9f22-75f3e1644622-config\") pod \"route-controller-manager-6576b87f9c-rfpfx\" (UID: \"d5ca9be8-efe6-40e8-9f22-75f3e1644622\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rfpfx" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.739276 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/bd1c8146-fe00-4a53-a102-17cfc6ef045b-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-gspst\" (UID: \"bd1c8146-fe00-4a53-a102-17cfc6ef045b\") " pod="openshift-authentication/oauth-openshift-558db77b4-gspst" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.739620 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/a5919bd4-5d4e-4aa1-9b66-36460e7e24f3-etcd-client\") pod \"apiserver-76f77b778f-bkhfv\" (UID: \"a5919bd4-5d4e-4aa1-9b66-36460e7e24f3\") " pod="openshift-apiserver/apiserver-76f77b778f-bkhfv" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.739868 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/69136696-f636-4c23-b89a-bfbb2eba3a85-audit-policies\") pod \"apiserver-7bbb656c7d-5x69q\" (UID: \"69136696-f636-4c23-b89a-bfbb2eba3a85\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5x69q" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.740348 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5172e7b5-3ef1-4f51-8874-8d4ac858284b-config\") pod \"machine-api-operator-5694c8668f-b5qm4\" (UID: \"5172e7b5-3ef1-4f51-8874-8d4ac858284b\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-b5qm4" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.740693 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/807933d0-9a58-4191-9bde-74a00551f72e-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-qk4fp\" (UID: \"807933d0-9a58-4191-9bde-74a00551f72e\") " pod="openshift-controller-manager/controller-manager-879f6c89f-qk4fp" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.740848 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e5784d1c-d283-408d-a435-66b7cda6ac32-config\") pod \"openshift-apiserver-operator-796bbdcf4f-bv6vv\" (UID: \"e5784d1c-d283-408d-a435-66b7cda6ac32\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bv6vv" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.740991 4677 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/bd1c8146-fe00-4a53-a102-17cfc6ef045b-audit-dir\") pod \"oauth-openshift-558db77b4-gspst\" (UID: \"bd1c8146-fe00-4a53-a102-17cfc6ef045b\") " pod="openshift-authentication/oauth-openshift-558db77b4-gspst" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.741381 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/d96fbf2b-aad8-46ca-88a1-df1e9624f0ed-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-zvfd4\" (UID: \"d96fbf2b-aad8-46ca-88a1-df1e9624f0ed\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-zvfd4" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.741613 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-rfpfx"] Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.741694 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-b9mxw" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.741847 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a5919bd4-5d4e-4aa1-9b66-36460e7e24f3-serving-cert\") pod \"apiserver-76f77b778f-bkhfv\" (UID: \"a5919bd4-5d4e-4aa1-9b66-36460e7e24f3\") " pod="openshift-apiserver/apiserver-76f77b778f-bkhfv" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.742161 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/807933d0-9a58-4191-9bde-74a00551f72e-serving-cert\") pod \"controller-manager-879f6c89f-qk4fp\" (UID: \"807933d0-9a58-4191-9bde-74a00551f72e\") " pod="openshift-controller-manager/controller-manager-879f6c89f-qk4fp" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.742238 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fb2aac2a-eb0d-423b-a574-550cfb36adca-serving-cert\") pod \"authentication-operator-69f744f599-n2kcb\" (UID: \"fb2aac2a-eb0d-423b-a574-550cfb36adca\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-n2kcb" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.743142 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a5919bd4-5d4e-4aa1-9b66-36460e7e24f3-config\") pod \"apiserver-76f77b778f-bkhfv\" (UID: \"a5919bd4-5d4e-4aa1-9b66-36460e7e24f3\") " pod="openshift-apiserver/apiserver-76f77b778f-bkhfv" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.744202 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/bd1c8146-fe00-4a53-a102-17cfc6ef045b-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-gspst\" (UID: \"bd1c8146-fe00-4a53-a102-17cfc6ef045b\") " pod="openshift-authentication/oauth-openshift-558db77b4-gspst" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.746194 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/bd1c8146-fe00-4a53-a102-17cfc6ef045b-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-gspst\" (UID: 
\"bd1c8146-fe00-4a53-a102-17cfc6ef045b\") " pod="openshift-authentication/oauth-openshift-558db77b4-gspst" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.746277 4677 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.747503 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/bd1c8146-fe00-4a53-a102-17cfc6ef045b-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-gspst\" (UID: \"bd1c8146-fe00-4a53-a102-17cfc6ef045b\") " pod="openshift-authentication/oauth-openshift-558db77b4-gspst" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.747797 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d5ca9be8-efe6-40e8-9f22-75f3e1644622-serving-cert\") pod \"route-controller-manager-6576b87f9c-rfpfx\" (UID: \"d5ca9be8-efe6-40e8-9f22-75f3e1644622\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rfpfx" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.748402 4677 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-gsr6h"] Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.748785 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/bd1c8146-fe00-4a53-a102-17cfc6ef045b-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-gspst\" (UID: \"bd1c8146-fe00-4a53-a102-17cfc6ef045b\") " pod="openshift-authentication/oauth-openshift-558db77b4-gspst" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.749064 4677 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gsr6h" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.749293 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/69136696-f636-4c23-b89a-bfbb2eba3a85-etcd-client\") pod \"apiserver-7bbb656c7d-5x69q\" (UID: \"69136696-f636-4c23-b89a-bfbb2eba3a85\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5x69q" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.749597 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/a5919bd4-5d4e-4aa1-9b66-36460e7e24f3-encryption-config\") pod \"apiserver-76f77b778f-bkhfv\" (UID: \"a5919bd4-5d4e-4aa1-9b66-36460e7e24f3\") " pod="openshift-apiserver/apiserver-76f77b778f-bkhfv" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.749621 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/bd1c8146-fe00-4a53-a102-17cfc6ef045b-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-gspst\" (UID: \"bd1c8146-fe00-4a53-a102-17cfc6ef045b\") " pod="openshift-authentication/oauth-openshift-558db77b4-gspst" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.749639 4677 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-7pwvn"] Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.749781 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/69136696-f636-4c23-b89a-bfbb2eba3a85-encryption-config\") pod \"apiserver-7bbb656c7d-5x69q\" (UID: \"69136696-f636-4c23-b89a-bfbb2eba3a85\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5x69q" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.750060 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e5784d1c-d283-408d-a435-66b7cda6ac32-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-bv6vv\" (UID: \"e5784d1c-d283-408d-a435-66b7cda6ac32\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bv6vv" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.750255 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/bd1c8146-fe00-4a53-a102-17cfc6ef045b-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-gspst\" (UID: \"bd1c8146-fe00-4a53-a102-17cfc6ef045b\") " pod="openshift-authentication/oauth-openshift-558db77b4-gspst" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.750559 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-7pwvn" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.752467 4677 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-ctrht"] Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.753111 4677 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-ctrht" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.755597 4677 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-cn5r8"] Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.756206 4677 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lprql"] Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.756371 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-cn5r8" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.756543 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lprql" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.759582 4677 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-wklrf"] Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.767129 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-qk4fp"] Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.767172 4677 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-r6nq5"] Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.772566 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/bd1c8146-fe00-4a53-a102-17cfc6ef045b-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-gspst\" (UID: \"bd1c8146-fe00-4a53-a102-17cfc6ef045b\") " pod="openshift-authentication/oauth-openshift-558db77b4-gspst" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.773561 4677 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.774416 4677 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-fd92z"] Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.775444 4677 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-rsbwp"] Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.767456 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-wklrf" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.775803 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-r6nq5" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.775986 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-fd92z" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.776027 4677 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330700-zvg2m"] Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.776248 4677 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-rsbwp" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.783688 4677 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dqz8j"] Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.783868 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330700-zvg2m" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.785090 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dqz8j" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.787839 4677 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.791017 4677 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-v42j9"] Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.791860 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-v42j9" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.792642 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-n2kcb"] Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.794859 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-bkhfv"] Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.796047 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-zvfd4"] Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.797584 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bv6vv"] Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.799394 4677 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-bt865"] Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.800169 4677 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-bt865" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.800670 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wt49f"] Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.802518 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-mwgkm"] Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.804132 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gmgcc"] Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.805211 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-b9mxw"] Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.806462 4677 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.806625 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-nhjp9"] Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.807783 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-9mpgg"] Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.808687 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-bldn4"] Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.809953 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-rkv4v"] Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.810920 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-hn2p2"] Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.811887 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-s576r"] Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.812911 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-ctrht"] Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.813860 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-gspst"] Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.814839 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-5x69q"] Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.817024 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-cn5r8"] Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.818230 4677 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-9dgw6"] Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.818756 4677 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-9dgw6" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.819065 4677 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-sk5ff"] Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.819686 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-sk5ff" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.820102 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-xcssh"] Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.821173 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jh976"] Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.821652 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/c3c43bec-10f0-4f24-b9f4-6d0cfb694cb5-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-b29lb\" (UID: \"c3c43bec-10f0-4f24-b9f4-6d0cfb694cb5\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-b29lb" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.821682 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c3c43bec-10f0-4f24-b9f4-6d0cfb694cb5-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-b29lb\" (UID: \"c3c43bec-10f0-4f24-b9f4-6d0cfb694cb5\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-b29lb" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.821707 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q6xr8\" (UniqueName: \"kubernetes.io/projected/07dd67bc-303c-4a2a-bc68-e430cc5e63c2-kube-api-access-q6xr8\") pod \"dns-operator-744455d44c-xcssh\" (UID: \"07dd67bc-303c-4a2a-bc68-e430cc5e63c2\") " pod="openshift-dns-operator/dns-operator-744455d44c-xcssh" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.821723 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qrdr6\" (UniqueName: \"kubernetes.io/projected/6edf078f-96f5-46e0-be5c-012e4799f320-kube-api-access-qrdr6\") pod \"machine-approver-56656f9798-psk9k\" (UID: \"6edf078f-96f5-46e0-be5c-012e4799f320\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-psk9k" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.821743 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/57faa01e-5137-4dde-9102-e80bb5891cc5-serving-cert\") pod \"etcd-operator-b45778765-rkv4v\" (UID: \"57faa01e-5137-4dde-9102-e80bb5891cc5\") " pod="openshift-etcd-operator/etcd-operator-b45778765-rkv4v" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.821767 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/57faa01e-5137-4dde-9102-e80bb5891cc5-etcd-service-ca\") pod \"etcd-operator-b45778765-rkv4v\" (UID: \"57faa01e-5137-4dde-9102-e80bb5891cc5\") " pod="openshift-etcd-operator/etcd-operator-b45778765-rkv4v" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.821789 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/579c8074-5c53-4e1f-a620-f04cbccf63aa-serving-cert\") pod \"console-operator-58897d9998-hn2p2\" (UID: \"579c8074-5c53-4e1f-a620-f04cbccf63aa\") " pod="openshift-console-operator/console-operator-58897d9998-hn2p2" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.821814 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/364ed7ee-3c5a-4d7f-ba97-ddd52483de83-console-config\") pod \"console-f9d7485db-76928\" (UID: \"364ed7ee-3c5a-4d7f-ba97-ddd52483de83\") " pod="openshift-console/console-f9d7485db-76928" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.821830 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/6edf078f-96f5-46e0-be5c-012e4799f320-auth-proxy-config\") pod \"machine-approver-56656f9798-psk9k\" (UID: \"6edf078f-96f5-46e0-be5c-012e4799f320\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-psk9k" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.821858 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7fc2830b-0f4d-4b3b-89bc-6e589839077d-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-wt49f\" (UID: \"7fc2830b-0f4d-4b3b-89bc-6e589839077d\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wt49f" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.821891 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/364ed7ee-3c5a-4d7f-ba97-ddd52483de83-console-serving-cert\") pod \"console-f9d7485db-76928\" (UID: \"364ed7ee-3c5a-4d7f-ba97-ddd52483de83\") " pod="openshift-console/console-f9d7485db-76928" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.821907 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c3c43bec-10f0-4f24-b9f4-6d0cfb694cb5-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-b29lb\" (UID: \"c3c43bec-10f0-4f24-b9f4-6d0cfb694cb5\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-b29lb" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.821923 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-px67s\" (UniqueName: \"kubernetes.io/projected/579c8074-5c53-4e1f-a620-f04cbccf63aa-kube-api-access-px67s\") pod \"console-operator-58897d9998-hn2p2\" (UID: \"579c8074-5c53-4e1f-a620-f04cbccf63aa\") " pod="openshift-console-operator/console-operator-58897d9998-hn2p2" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.821944 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6edf078f-96f5-46e0-be5c-012e4799f320-config\") pod \"machine-approver-56656f9798-psk9k\" (UID: \"6edf078f-96f5-46e0-be5c-012e4799f320\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-psk9k" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.821959 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/57faa01e-5137-4dde-9102-e80bb5891cc5-etcd-ca\") pod \"etcd-operator-b45778765-rkv4v\" (UID: \"57faa01e-5137-4dde-9102-e80bb5891cc5\") " 
pod="openshift-etcd-operator/etcd-operator-b45778765-rkv4v" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.821974 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5bxz2\" (UniqueName: \"kubernetes.io/projected/c3c43bec-10f0-4f24-b9f4-6d0cfb694cb5-kube-api-access-5bxz2\") pod \"cluster-image-registry-operator-dc59b4c8b-b29lb\" (UID: \"c3c43bec-10f0-4f24-b9f4-6d0cfb694cb5\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-b29lb" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.821990 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/364ed7ee-3c5a-4d7f-ba97-ddd52483de83-trusted-ca-bundle\") pod \"console-f9d7485db-76928\" (UID: \"364ed7ee-3c5a-4d7f-ba97-ddd52483de83\") " pod="openshift-console/console-f9d7485db-76928" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.822007 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/364ed7ee-3c5a-4d7f-ba97-ddd52483de83-oauth-serving-cert\") pod \"console-f9d7485db-76928\" (UID: \"364ed7ee-3c5a-4d7f-ba97-ddd52483de83\") " pod="openshift-console/console-f9d7485db-76928" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.822022 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lb7t2\" (UniqueName: \"kubernetes.io/projected/364ed7ee-3c5a-4d7f-ba97-ddd52483de83-kube-api-access-lb7t2\") pod \"console-f9d7485db-76928\" (UID: \"364ed7ee-3c5a-4d7f-ba97-ddd52483de83\") " pod="openshift-console/console-f9d7485db-76928" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.822040 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/579c8074-5c53-4e1f-a620-f04cbccf63aa-config\") pod \"console-operator-58897d9998-hn2p2\" (UID: \"579c8074-5c53-4e1f-a620-f04cbccf63aa\") " pod="openshift-console-operator/console-operator-58897d9998-hn2p2" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.822054 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/57faa01e-5137-4dde-9102-e80bb5891cc5-etcd-client\") pod \"etcd-operator-b45778765-rkv4v\" (UID: \"57faa01e-5137-4dde-9102-e80bb5891cc5\") " pod="openshift-etcd-operator/etcd-operator-b45778765-rkv4v" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.822075 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/364ed7ee-3c5a-4d7f-ba97-ddd52483de83-service-ca\") pod \"console-f9d7485db-76928\" (UID: \"364ed7ee-3c5a-4d7f-ba97-ddd52483de83\") " pod="openshift-console/console-f9d7485db-76928" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.822095 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5pvkj\" (UniqueName: \"kubernetes.io/projected/67be2c46-f396-48d1-ba5e-d21f8362a4dc-kube-api-access-5pvkj\") pod \"downloads-7954f5f757-mwgkm\" (UID: \"67be2c46-f396-48d1-ba5e-d21f8362a4dc\") " pod="openshift-console/downloads-7954f5f757-mwgkm" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.822114 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/07dd67bc-303c-4a2a-bc68-e430cc5e63c2-metrics-tls\") pod 
\"dns-operator-744455d44c-xcssh\" (UID: \"07dd67bc-303c-4a2a-bc68-e430cc5e63c2\") " pod="openshift-dns-operator/dns-operator-744455d44c-xcssh" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.822130 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8nx9t\" (UniqueName: \"kubernetes.io/projected/57faa01e-5137-4dde-9102-e80bb5891cc5-kube-api-access-8nx9t\") pod \"etcd-operator-b45778765-rkv4v\" (UID: \"57faa01e-5137-4dde-9102-e80bb5891cc5\") " pod="openshift-etcd-operator/etcd-operator-b45778765-rkv4v" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.822164 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/579c8074-5c53-4e1f-a620-f04cbccf63aa-trusted-ca\") pod \"console-operator-58897d9998-hn2p2\" (UID: \"579c8074-5c53-4e1f-a620-f04cbccf63aa\") " pod="openshift-console-operator/console-operator-58897d9998-hn2p2" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.822179 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4vvl7\" (UniqueName: \"kubernetes.io/projected/7fc2830b-0f4d-4b3b-89bc-6e589839077d-kube-api-access-4vvl7\") pod \"openshift-controller-manager-operator-756b6f6bc6-wt49f\" (UID: \"7fc2830b-0f4d-4b3b-89bc-6e589839077d\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wt49f" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.822193 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/364ed7ee-3c5a-4d7f-ba97-ddd52483de83-console-oauth-config\") pod \"console-f9d7485db-76928\" (UID: \"364ed7ee-3c5a-4d7f-ba97-ddd52483de83\") " pod="openshift-console/console-f9d7485db-76928" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.822211 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/6edf078f-96f5-46e0-be5c-012e4799f320-machine-approver-tls\") pod \"machine-approver-56656f9798-psk9k\" (UID: \"6edf078f-96f5-46e0-be5c-012e4799f320\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-psk9k" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.822229 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/57faa01e-5137-4dde-9102-e80bb5891cc5-config\") pod \"etcd-operator-b45778765-rkv4v\" (UID: \"57faa01e-5137-4dde-9102-e80bb5891cc5\") " pod="openshift-etcd-operator/etcd-operator-b45778765-rkv4v" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.822247 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7fc2830b-0f4d-4b3b-89bc-6e589839077d-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-wt49f\" (UID: \"7fc2830b-0f4d-4b3b-89bc-6e589839077d\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wt49f" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.823140 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c3c43bec-10f0-4f24-b9f4-6d0cfb694cb5-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-b29lb\" (UID: \"c3c43bec-10f0-4f24-b9f4-6d0cfb694cb5\") " 
pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-b29lb" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.823450 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7fc2830b-0f4d-4b3b-89bc-6e589839077d-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-wt49f\" (UID: \"7fc2830b-0f4d-4b3b-89bc-6e589839077d\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wt49f" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.823919 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/364ed7ee-3c5a-4d7f-ba97-ddd52483de83-trusted-ca-bundle\") pod \"console-f9d7485db-76928\" (UID: \"364ed7ee-3c5a-4d7f-ba97-ddd52483de83\") " pod="openshift-console/console-f9d7485db-76928" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.824018 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/364ed7ee-3c5a-4d7f-ba97-ddd52483de83-console-config\") pod \"console-f9d7485db-76928\" (UID: \"364ed7ee-3c5a-4d7f-ba97-ddd52483de83\") " pod="openshift-console/console-f9d7485db-76928" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.824493 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/579c8074-5c53-4e1f-a620-f04cbccf63aa-trusted-ca\") pod \"console-operator-58897d9998-hn2p2\" (UID: \"579c8074-5c53-4e1f-a620-f04cbccf63aa\") " pod="openshift-console-operator/console-operator-58897d9998-hn2p2" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.824499 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/364ed7ee-3c5a-4d7f-ba97-ddd52483de83-oauth-serving-cert\") pod \"console-f9d7485db-76928\" (UID: \"364ed7ee-3c5a-4d7f-ba97-ddd52483de83\") " pod="openshift-console/console-f9d7485db-76928" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.825035 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/6edf078f-96f5-46e0-be5c-012e4799f320-auth-proxy-config\") pod \"machine-approver-56656f9798-psk9k\" (UID: \"6edf078f-96f5-46e0-be5c-012e4799f320\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-psk9k" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.825129 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dqz8j"] Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.825159 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-b29lb"] Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.825206 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/c3c43bec-10f0-4f24-b9f4-6d0cfb694cb5-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-b29lb\" (UID: \"c3c43bec-10f0-4f24-b9f4-6d0cfb694cb5\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-b29lb" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.825226 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-c4qbb"] Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.825463 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/579c8074-5c53-4e1f-a620-f04cbccf63aa-serving-cert\") pod \"console-operator-58897d9998-hn2p2\" (UID: \"579c8074-5c53-4e1f-a620-f04cbccf63aa\") " pod="openshift-console-operator/console-operator-58897d9998-hn2p2" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.825619 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7fc2830b-0f4d-4b3b-89bc-6e589839077d-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-wt49f\" (UID: \"7fc2830b-0f4d-4b3b-89bc-6e589839077d\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wt49f" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.826129 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6edf078f-96f5-46e0-be5c-012e4799f320-config\") pod \"machine-approver-56656f9798-psk9k\" (UID: \"6edf078f-96f5-46e0-be5c-012e4799f320\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-psk9k" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.827083 4677 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.827088 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/579c8074-5c53-4e1f-a620-f04cbccf63aa-config\") pod \"console-operator-58897d9998-hn2p2\" (UID: \"579c8074-5c53-4e1f-a620-f04cbccf63aa\") " pod="openshift-console-operator/console-operator-58897d9998-hn2p2" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.827115 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-76928"] Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.827320 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/364ed7ee-3c5a-4d7f-ba97-ddd52483de83-service-ca\") pod \"console-f9d7485db-76928\" (UID: \"364ed7ee-3c5a-4d7f-ba97-ddd52483de83\") " pod="openshift-console/console-f9d7485db-76928" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.828065 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-sk5ff"] Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.828322 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/364ed7ee-3c5a-4d7f-ba97-ddd52483de83-console-serving-cert\") pod \"console-f9d7485db-76928\" (UID: \"364ed7ee-3c5a-4d7f-ba97-ddd52483de83\") " pod="openshift-console/console-f9d7485db-76928" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.828948 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/6edf078f-96f5-46e0-be5c-012e4799f320-machine-approver-tls\") pod \"machine-approver-56656f9798-psk9k\" (UID: \"6edf078f-96f5-46e0-be5c-012e4799f320\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-psk9k" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.829276 4677 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/364ed7ee-3c5a-4d7f-ba97-ddd52483de83-console-oauth-config\") pod \"console-f9d7485db-76928\" (UID: \"364ed7ee-3c5a-4d7f-ba97-ddd52483de83\") " pod="openshift-console/console-f9d7485db-76928" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.829322 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lprql"] Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.830488 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-wklrf"] Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.831259 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/57faa01e-5137-4dde-9102-e80bb5891cc5-serving-cert\") pod \"etcd-operator-b45778765-rkv4v\" (UID: \"57faa01e-5137-4dde-9102-e80bb5891cc5\") " pod="openshift-etcd-operator/etcd-operator-b45778765-rkv4v" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.831557 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-qgcjl"] Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.832773 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-v42j9"] Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.833787 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330700-zvg2m"] Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.834863 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-gsr6h"] Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.835889 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-bt865"] Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.836560 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/57faa01e-5137-4dde-9102-e80bb5891cc5-config\") pod \"etcd-operator-b45778765-rkv4v\" (UID: \"57faa01e-5137-4dde-9102-e80bb5891cc5\") " pod="openshift-etcd-operator/etcd-operator-b45778765-rkv4v" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.839245 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-fd92z"] Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.839278 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-r6nq5"] Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.839290 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-rsbwp"] Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.841568 4677 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-wpdhg"] Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.842610 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-wpdhg"] Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.842689 4677 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-wpdhg" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.845840 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/57faa01e-5137-4dde-9102-e80bb5891cc5-etcd-client\") pod \"etcd-operator-b45778765-rkv4v\" (UID: \"57faa01e-5137-4dde-9102-e80bb5891cc5\") " pod="openshift-etcd-operator/etcd-operator-b45778765-rkv4v" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.846877 4677 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.859830 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/57faa01e-5137-4dde-9102-e80bb5891cc5-etcd-ca\") pod \"etcd-operator-b45778765-rkv4v\" (UID: \"57faa01e-5137-4dde-9102-e80bb5891cc5\") " pod="openshift-etcd-operator/etcd-operator-b45778765-rkv4v" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.866908 4677 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.876644 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/57faa01e-5137-4dde-9102-e80bb5891cc5-etcd-service-ca\") pod \"etcd-operator-b45778765-rkv4v\" (UID: \"57faa01e-5137-4dde-9102-e80bb5891cc5\") " pod="openshift-etcd-operator/etcd-operator-b45778765-rkv4v" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.886850 4677 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.907083 4677 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.926362 4677 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.946893 4677 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.957855 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/07dd67bc-303c-4a2a-bc68-e430cc5e63c2-metrics-tls\") pod \"dns-operator-744455d44c-xcssh\" (UID: \"07dd67bc-303c-4a2a-bc68-e430cc5e63c2\") " pod="openshift-dns-operator/dns-operator-744455d44c-xcssh" Oct 07 13:09:28 crc kubenswrapper[4677]: I1007 13:09:28.967535 4677 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Oct 07 13:09:29 crc kubenswrapper[4677]: I1007 13:09:29.027093 4677 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Oct 07 13:09:29 crc kubenswrapper[4677]: I1007 13:09:29.047906 4677 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Oct 07 13:09:29 crc kubenswrapper[4677]: I1007 13:09:29.067348 4677 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Oct 07 13:09:29 crc kubenswrapper[4677]: I1007 13:09:29.097897 4677 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Oct 07 13:09:29 crc kubenswrapper[4677]: I1007 13:09:29.107808 4677 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Oct 07 13:09:29 crc kubenswrapper[4677]: I1007 13:09:29.127548 4677 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Oct 07 13:09:29 crc kubenswrapper[4677]: I1007 13:09:29.148070 4677 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Oct 07 13:09:29 crc kubenswrapper[4677]: I1007 13:09:29.167991 4677 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Oct 07 13:09:29 crc kubenswrapper[4677]: I1007 13:09:29.188517 4677 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Oct 07 13:09:29 crc kubenswrapper[4677]: I1007 13:09:29.207062 4677 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Oct 07 13:09:29 crc kubenswrapper[4677]: I1007 13:09:29.226776 4677 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Oct 07 13:09:29 crc kubenswrapper[4677]: I1007 13:09:29.247169 4677 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Oct 07 13:09:29 crc kubenswrapper[4677]: I1007 13:09:29.267825 4677 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Oct 07 13:09:29 crc kubenswrapper[4677]: I1007 13:09:29.286888 4677 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Oct 07 13:09:29 crc kubenswrapper[4677]: I1007 13:09:29.307418 4677 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Oct 07 13:09:29 crc kubenswrapper[4677]: I1007 13:09:29.328147 4677 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Oct 07 13:09:29 crc kubenswrapper[4677]: I1007 13:09:29.346788 4677 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Oct 07 13:09:29 crc kubenswrapper[4677]: I1007 13:09:29.367771 4677 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Oct 07 13:09:29 crc kubenswrapper[4677]: I1007 13:09:29.387523 4677 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Oct 07 13:09:29 crc kubenswrapper[4677]: I1007 13:09:29.407828 4677 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Oct 07 13:09:29 crc kubenswrapper[4677]: I1007 13:09:29.427576 4677 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" 
Oct 07 13:09:29 crc kubenswrapper[4677]: I1007 13:09:29.466518 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-27ch5\" (UniqueName: \"kubernetes.io/projected/a5919bd4-5d4e-4aa1-9b66-36460e7e24f3-kube-api-access-27ch5\") pod \"apiserver-76f77b778f-bkhfv\" (UID: \"a5919bd4-5d4e-4aa1-9b66-36460e7e24f3\") " pod="openshift-apiserver/apiserver-76f77b778f-bkhfv" Oct 07 13:09:29 crc kubenswrapper[4677]: I1007 13:09:29.496628 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nd6kp\" (UniqueName: \"kubernetes.io/projected/d96fbf2b-aad8-46ca-88a1-df1e9624f0ed-kube-api-access-nd6kp\") pod \"cluster-samples-operator-665b6dd947-zvfd4\" (UID: \"d96fbf2b-aad8-46ca-88a1-df1e9624f0ed\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-zvfd4" Oct 07 13:09:29 crc kubenswrapper[4677]: I1007 13:09:29.507100 4677 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Oct 07 13:09:29 crc kubenswrapper[4677]: I1007 13:09:29.507456 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dwmwr\" (UniqueName: \"kubernetes.io/projected/e5784d1c-d283-408d-a435-66b7cda6ac32-kube-api-access-dwmwr\") pod \"openshift-apiserver-operator-796bbdcf4f-bv6vv\" (UID: \"e5784d1c-d283-408d-a435-66b7cda6ac32\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bv6vv" Oct 07 13:09:29 crc kubenswrapper[4677]: I1007 13:09:29.527576 4677 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Oct 07 13:09:29 crc kubenswrapper[4677]: I1007 13:09:29.547126 4677 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Oct 07 13:09:29 crc kubenswrapper[4677]: I1007 13:09:29.560406 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-bkhfv" Oct 07 13:09:29 crc kubenswrapper[4677]: I1007 13:09:29.567880 4677 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Oct 07 13:09:29 crc kubenswrapper[4677]: I1007 13:09:29.618122 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zwbdg\" (UniqueName: \"kubernetes.io/projected/807933d0-9a58-4191-9bde-74a00551f72e-kube-api-access-zwbdg\") pod \"controller-manager-879f6c89f-qk4fp\" (UID: \"807933d0-9a58-4191-9bde-74a00551f72e\") " pod="openshift-controller-manager/controller-manager-879f6c89f-qk4fp" Oct 07 13:09:29 crc kubenswrapper[4677]: I1007 13:09:29.626569 4677 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bv6vv" Oct 07 13:09:29 crc kubenswrapper[4677]: I1007 13:09:29.638051 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jd2jw\" (UniqueName: \"kubernetes.io/projected/fb2aac2a-eb0d-423b-a574-550cfb36adca-kube-api-access-jd2jw\") pod \"authentication-operator-69f744f599-n2kcb\" (UID: \"fb2aac2a-eb0d-423b-a574-550cfb36adca\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-n2kcb" Oct 07 13:09:29 crc kubenswrapper[4677]: I1007 13:09:29.653881 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nd4zw\" (UniqueName: \"kubernetes.io/projected/d5ca9be8-efe6-40e8-9f22-75f3e1644622-kube-api-access-nd4zw\") pod \"route-controller-manager-6576b87f9c-rfpfx\" (UID: \"d5ca9be8-efe6-40e8-9f22-75f3e1644622\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rfpfx" Oct 07 13:09:29 crc kubenswrapper[4677]: I1007 13:09:29.679694 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-svvr2\" (UniqueName: \"kubernetes.io/projected/5172e7b5-3ef1-4f51-8874-8d4ac858284b-kube-api-access-svvr2\") pod \"machine-api-operator-5694c8668f-b5qm4\" (UID: \"5172e7b5-3ef1-4f51-8874-8d4ac858284b\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-b5qm4" Oct 07 13:09:29 crc kubenswrapper[4677]: I1007 13:09:29.699682 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-zvfd4" Oct 07 13:09:29 crc kubenswrapper[4677]: I1007 13:09:29.702915 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d5xbw\" (UniqueName: \"kubernetes.io/projected/bd1c8146-fe00-4a53-a102-17cfc6ef045b-kube-api-access-d5xbw\") pod \"oauth-openshift-558db77b4-gspst\" (UID: \"bd1c8146-fe00-4a53-a102-17cfc6ef045b\") " pod="openshift-authentication/oauth-openshift-558db77b4-gspst" Oct 07 13:09:29 crc kubenswrapper[4677]: I1007 13:09:29.708013 4677 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Oct 07 13:09:29 crc kubenswrapper[4677]: I1007 13:09:29.709045 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h9vpd\" (UniqueName: \"kubernetes.io/projected/69136696-f636-4c23-b89a-bfbb2eba3a85-kube-api-access-h9vpd\") pod \"apiserver-7bbb656c7d-5x69q\" (UID: \"69136696-f636-4c23-b89a-bfbb2eba3a85\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5x69q" Oct 07 13:09:29 crc kubenswrapper[4677]: I1007 13:09:29.733699 4677 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Oct 07 13:09:29 crc kubenswrapper[4677]: I1007 13:09:29.747744 4677 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Oct 07 13:09:29 crc kubenswrapper[4677]: I1007 13:09:29.764268 4677 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-qk4fp" Oct 07 13:09:29 crc kubenswrapper[4677]: I1007 13:09:29.766590 4677 request.go:700] Waited for 1.017253469s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/secrets?fieldSelector=metadata.name%3Dmco-proxy-tls&limit=500&resourceVersion=0 Oct 07 13:09:29 crc kubenswrapper[4677]: I1007 13:09:29.771063 4677 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Oct 07 13:09:29 crc kubenswrapper[4677]: I1007 13:09:29.787697 4677 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Oct 07 13:09:29 crc kubenswrapper[4677]: I1007 13:09:29.797640 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-b5qm4" Oct 07 13:09:29 crc kubenswrapper[4677]: I1007 13:09:29.807339 4677 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Oct 07 13:09:29 crc kubenswrapper[4677]: I1007 13:09:29.826835 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-bkhfv"] Oct 07 13:09:29 crc kubenswrapper[4677]: I1007 13:09:29.828922 4677 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Oct 07 13:09:29 crc kubenswrapper[4677]: I1007 13:09:29.837853 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rfpfx" Oct 07 13:09:29 crc kubenswrapper[4677]: I1007 13:09:29.847318 4677 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Oct 07 13:09:29 crc kubenswrapper[4677]: I1007 13:09:29.868013 4677 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Oct 07 13:09:29 crc kubenswrapper[4677]: I1007 13:09:29.887303 4677 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Oct 07 13:09:29 crc kubenswrapper[4677]: I1007 13:09:29.902092 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5x69q" Oct 07 13:09:29 crc kubenswrapper[4677]: I1007 13:09:29.905706 4677 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-n2kcb" Oct 07 13:09:29 crc kubenswrapper[4677]: I1007 13:09:29.910248 4677 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Oct 07 13:09:29 crc kubenswrapper[4677]: I1007 13:09:29.928966 4677 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Oct 07 13:09:29 crc kubenswrapper[4677]: I1007 13:09:29.947586 4677 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Oct 07 13:09:29 crc kubenswrapper[4677]: I1007 13:09:29.967545 4677 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Oct 07 13:09:29 crc kubenswrapper[4677]: I1007 13:09:29.986185 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-gspst" Oct 07 13:09:29 crc kubenswrapper[4677]: I1007 13:09:29.988759 4677 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Oct 07 13:09:30 crc kubenswrapper[4677]: I1007 13:09:30.005536 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-qk4fp"] Oct 07 13:09:30 crc kubenswrapper[4677]: I1007 13:09:30.007003 4677 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Oct 07 13:09:30 crc kubenswrapper[4677]: W1007 13:09:30.022404 4677 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod807933d0_9a58_4191_9bde_74a00551f72e.slice/crio-2a7b3dc8f957709c071215c44cf39ae96c90fa479e1853ec57829df5e0cbd65b WatchSource:0}: Error finding container 2a7b3dc8f957709c071215c44cf39ae96c90fa479e1853ec57829df5e0cbd65b: Status 404 returned error can't find the container with id 2a7b3dc8f957709c071215c44cf39ae96c90fa479e1853ec57829df5e0cbd65b Oct 07 13:09:30 crc kubenswrapper[4677]: I1007 13:09:30.027580 4677 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Oct 07 13:09:30 crc kubenswrapper[4677]: I1007 13:09:30.041020 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-b5qm4"] Oct 07 13:09:30 crc kubenswrapper[4677]: I1007 13:09:30.047345 4677 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Oct 07 13:09:30 crc kubenswrapper[4677]: I1007 13:09:30.066161 4677 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Oct 07 13:09:30 crc kubenswrapper[4677]: I1007 13:09:30.071360 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-qk4fp" event={"ID":"807933d0-9a58-4191-9bde-74a00551f72e","Type":"ContainerStarted","Data":"2a7b3dc8f957709c071215c44cf39ae96c90fa479e1853ec57829df5e0cbd65b"} Oct 07 13:09:30 crc kubenswrapper[4677]: I1007 13:09:30.077550 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bv6vv"] Oct 07 13:09:30 crc kubenswrapper[4677]: I1007 13:09:30.077772 4677 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-b5qm4" event={"ID":"5172e7b5-3ef1-4f51-8874-8d4ac858284b","Type":"ContainerStarted","Data":"bf9dd6e19cc20fddf08e9d60c71d5ffe5d510ac5e9b91727f0fe28e2a15bbeb3"} Oct 07 13:09:30 crc kubenswrapper[4677]: I1007 13:09:30.082373 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-bkhfv" event={"ID":"a5919bd4-5d4e-4aa1-9b66-36460e7e24f3","Type":"ContainerStarted","Data":"c4ecfdc219c5c507795613f44034cade39316a7e50ba21273a086f1ffba85913"} Oct 07 13:09:30 crc kubenswrapper[4677]: I1007 13:09:30.088717 4677 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Oct 07 13:09:30 crc kubenswrapper[4677]: I1007 13:09:30.091422 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-rfpfx"] Oct 07 13:09:30 crc kubenswrapper[4677]: I1007 13:09:30.107371 4677 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Oct 07 13:09:30 crc kubenswrapper[4677]: W1007 13:09:30.114647 4677 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd5ca9be8_efe6_40e8_9f22_75f3e1644622.slice/crio-2a6f7fafebf1b1b972b32ee45cc3cfd824d9f284e729b99b23b492b882ba639b WatchSource:0}: Error finding container 2a6f7fafebf1b1b972b32ee45cc3cfd824d9f284e729b99b23b492b882ba639b: Status 404 returned error can't find the container with id 2a6f7fafebf1b1b972b32ee45cc3cfd824d9f284e729b99b23b492b882ba639b Oct 07 13:09:30 crc kubenswrapper[4677]: I1007 13:09:30.126911 4677 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Oct 07 13:09:30 crc kubenswrapper[4677]: I1007 13:09:30.147016 4677 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Oct 07 13:09:30 crc kubenswrapper[4677]: I1007 13:09:30.154115 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-zvfd4"] Oct 07 13:09:30 crc kubenswrapper[4677]: I1007 13:09:30.171029 4677 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Oct 07 13:09:30 crc kubenswrapper[4677]: I1007 13:09:30.192178 4677 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Oct 07 13:09:30 crc kubenswrapper[4677]: I1007 13:09:30.207865 4677 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Oct 07 13:09:30 crc kubenswrapper[4677]: I1007 13:09:30.226575 4677 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Oct 07 13:09:30 crc kubenswrapper[4677]: I1007 13:09:30.234733 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-gspst"] Oct 07 13:09:30 crc kubenswrapper[4677]: I1007 13:09:30.247023 4677 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Oct 07 13:09:30 crc kubenswrapper[4677]: I1007 13:09:30.266460 4677 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Oct 07 13:09:30 crc kubenswrapper[4677]: E1007 13:09:30.268863 4677 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda5919bd4_5d4e_4aa1_9b66_36460e7e24f3.slice/crio-ab854ec161379bbb515afda2d764134ef3d1ab657aa159f96d715e7085f83080.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda5919bd4_5d4e_4aa1_9b66_36460e7e24f3.slice/crio-conmon-ab854ec161379bbb515afda2d764134ef3d1ab657aa159f96d715e7085f83080.scope\": RecentStats: unable to find data in memory cache]" Oct 07 13:09:30 crc kubenswrapper[4677]: I1007 13:09:30.286687 4677 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Oct 07 13:09:30 crc kubenswrapper[4677]: I1007 13:09:30.307472 4677 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Oct 07 13:09:30 crc kubenswrapper[4677]: W1007 13:09:30.319220 4677 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbd1c8146_fe00_4a53_a102_17cfc6ef045b.slice/crio-7936f89d94c94cac5451e640bb52c560dfb7ad0a9e476277b82d7070d637ad6a WatchSource:0}: Error finding container 7936f89d94c94cac5451e640bb52c560dfb7ad0a9e476277b82d7070d637ad6a: Status 404 returned error can't find the container with id 7936f89d94c94cac5451e640bb52c560dfb7ad0a9e476277b82d7070d637ad6a Oct 07 13:09:30 crc kubenswrapper[4677]: I1007 13:09:30.331223 4677 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Oct 07 13:09:30 crc kubenswrapper[4677]: I1007 13:09:30.347961 4677 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Oct 07 13:09:30 crc kubenswrapper[4677]: I1007 13:09:30.358317 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-n2kcb"] Oct 07 13:09:30 crc kubenswrapper[4677]: I1007 13:09:30.364498 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-5x69q"] Oct 07 13:09:30 crc kubenswrapper[4677]: I1007 13:09:30.368884 4677 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 07 13:09:30 crc kubenswrapper[4677]: W1007 13:09:30.373475 4677 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfb2aac2a_eb0d_423b_a574_550cfb36adca.slice/crio-621e705864d079689940854cb69a2d92cdb2f9559786bfb600495d502dff81d4 WatchSource:0}: Error finding container 621e705864d079689940854cb69a2d92cdb2f9559786bfb600495d502dff81d4: Status 404 returned error can't find the container with id 621e705864d079689940854cb69a2d92cdb2f9559786bfb600495d502dff81d4 Oct 07 13:09:30 crc kubenswrapper[4677]: W1007 13:09:30.376026 4677 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod69136696_f636_4c23_b89a_bfbb2eba3a85.slice/crio-6c4209918b6aadebbfba399afec4dc94fbb5b79bae5b5ddc9b79804830bb03dc WatchSource:0}: Error finding container 6c4209918b6aadebbfba399afec4dc94fbb5b79bae5b5ddc9b79804830bb03dc: Status 404 returned error can't find the container with id 
6c4209918b6aadebbfba399afec4dc94fbb5b79bae5b5ddc9b79804830bb03dc Oct 07 13:09:30 crc kubenswrapper[4677]: I1007 13:09:30.387861 4677 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 07 13:09:30 crc kubenswrapper[4677]: I1007 13:09:30.409556 4677 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Oct 07 13:09:30 crc kubenswrapper[4677]: I1007 13:09:30.428300 4677 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Oct 07 13:09:30 crc kubenswrapper[4677]: I1007 13:09:30.456180 4677 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Oct 07 13:09:30 crc kubenswrapper[4677]: I1007 13:09:30.467012 4677 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Oct 07 13:09:30 crc kubenswrapper[4677]: I1007 13:09:30.487376 4677 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Oct 07 13:09:30 crc kubenswrapper[4677]: I1007 13:09:30.507088 4677 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Oct 07 13:09:30 crc kubenswrapper[4677]: I1007 13:09:30.527237 4677 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Oct 07 13:09:30 crc kubenswrapper[4677]: I1007 13:09:30.547825 4677 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Oct 07 13:09:30 crc kubenswrapper[4677]: I1007 13:09:30.567407 4677 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Oct 07 13:09:30 crc kubenswrapper[4677]: I1007 13:09:30.587031 4677 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Oct 07 13:09:30 crc kubenswrapper[4677]: I1007 13:09:30.607631 4677 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Oct 07 13:09:30 crc kubenswrapper[4677]: I1007 13:09:30.627383 4677 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Oct 07 13:09:30 crc kubenswrapper[4677]: I1007 13:09:30.647158 4677 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Oct 07 13:09:30 crc kubenswrapper[4677]: I1007 13:09:30.666669 4677 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Oct 07 13:09:30 crc kubenswrapper[4677]: I1007 13:09:30.688250 4677 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Oct 07 13:09:30 crc kubenswrapper[4677]: I1007 13:09:30.707668 4677 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Oct 07 13:09:30 crc kubenswrapper[4677]: I1007 13:09:30.749552 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q6xr8\" (UniqueName: \"kubernetes.io/projected/07dd67bc-303c-4a2a-bc68-e430cc5e63c2-kube-api-access-q6xr8\") pod \"dns-operator-744455d44c-xcssh\" (UID: 
\"07dd67bc-303c-4a2a-bc68-e430cc5e63c2\") " pod="openshift-dns-operator/dns-operator-744455d44c-xcssh" Oct 07 13:09:30 crc kubenswrapper[4677]: I1007 13:09:30.761284 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qrdr6\" (UniqueName: \"kubernetes.io/projected/6edf078f-96f5-46e0-be5c-012e4799f320-kube-api-access-qrdr6\") pod \"machine-approver-56656f9798-psk9k\" (UID: \"6edf078f-96f5-46e0-be5c-012e4799f320\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-psk9k" Oct 07 13:09:30 crc kubenswrapper[4677]: I1007 13:09:30.782128 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5pvkj\" (UniqueName: \"kubernetes.io/projected/67be2c46-f396-48d1-ba5e-d21f8362a4dc-kube-api-access-5pvkj\") pod \"downloads-7954f5f757-mwgkm\" (UID: \"67be2c46-f396-48d1-ba5e-d21f8362a4dc\") " pod="openshift-console/downloads-7954f5f757-mwgkm" Oct 07 13:09:30 crc kubenswrapper[4677]: I1007 13:09:30.786109 4677 request.go:700] Waited for 1.963255653s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/serviceaccounts/cluster-image-registry-operator/token Oct 07 13:09:30 crc kubenswrapper[4677]: I1007 13:09:30.803708 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5bxz2\" (UniqueName: \"kubernetes.io/projected/c3c43bec-10f0-4f24-b9f4-6d0cfb694cb5-kube-api-access-5bxz2\") pod \"cluster-image-registry-operator-dc59b4c8b-b29lb\" (UID: \"c3c43bec-10f0-4f24-b9f4-6d0cfb694cb5\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-b29lb" Oct 07 13:09:30 crc kubenswrapper[4677]: I1007 13:09:30.822341 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4vvl7\" (UniqueName: \"kubernetes.io/projected/7fc2830b-0f4d-4b3b-89bc-6e589839077d-kube-api-access-4vvl7\") pod \"openshift-controller-manager-operator-756b6f6bc6-wt49f\" (UID: \"7fc2830b-0f4d-4b3b-89bc-6e589839077d\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wt49f" Oct 07 13:09:30 crc kubenswrapper[4677]: I1007 13:09:30.838722 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8nx9t\" (UniqueName: \"kubernetes.io/projected/57faa01e-5137-4dde-9102-e80bb5891cc5-kube-api-access-8nx9t\") pod \"etcd-operator-b45778765-rkv4v\" (UID: \"57faa01e-5137-4dde-9102-e80bb5891cc5\") " pod="openshift-etcd-operator/etcd-operator-b45778765-rkv4v" Oct 07 13:09:30 crc kubenswrapper[4677]: I1007 13:09:30.860172 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lb7t2\" (UniqueName: \"kubernetes.io/projected/364ed7ee-3c5a-4d7f-ba97-ddd52483de83-kube-api-access-lb7t2\") pod \"console-f9d7485db-76928\" (UID: \"364ed7ee-3c5a-4d7f-ba97-ddd52483de83\") " pod="openshift-console/console-f9d7485db-76928" Oct 07 13:09:30 crc kubenswrapper[4677]: I1007 13:09:30.882563 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c3c43bec-10f0-4f24-b9f4-6d0cfb694cb5-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-b29lb\" (UID: \"c3c43bec-10f0-4f24-b9f4-6d0cfb694cb5\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-b29lb" Oct 07 13:09:30 crc kubenswrapper[4677]: I1007 13:09:30.907746 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-px67s\" (UniqueName: \"kubernetes.io/projected/579c8074-5c53-4e1f-a620-f04cbccf63aa-kube-api-access-px67s\") pod \"console-operator-58897d9998-hn2p2\" (UID: \"579c8074-5c53-4e1f-a620-f04cbccf63aa\") " pod="openshift-console-operator/console-operator-58897d9998-hn2p2" Oct 07 13:09:30 crc kubenswrapper[4677]: I1007 13:09:30.908377 4677 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Oct 07 13:09:30 crc kubenswrapper[4677]: I1007 13:09:30.916855 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wt49f" Oct 07 13:09:30 crc kubenswrapper[4677]: I1007 13:09:30.926936 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-mwgkm" Oct 07 13:09:30 crc kubenswrapper[4677]: I1007 13:09:30.929468 4677 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Oct 07 13:09:30 crc kubenswrapper[4677]: I1007 13:09:30.947933 4677 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Oct 07 13:09:30 crc kubenswrapper[4677]: I1007 13:09:30.965998 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-psk9k" Oct 07 13:09:30 crc kubenswrapper[4677]: I1007 13:09:30.974521 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-76928" Oct 07 13:09:30 crc kubenswrapper[4677]: I1007 13:09:30.981723 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-b29lb" Oct 07 13:09:30 crc kubenswrapper[4677]: I1007 13:09:30.994567 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-hn2p2" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.000895 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-rkv4v" Oct 07 13:09:31 crc kubenswrapper[4677]: W1007 13:09:31.003961 4677 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6edf078f_96f5_46e0_be5c_012e4799f320.slice/crio-545c49ed57b2c63b319287c8aee3257550d031eec4b961563a542daf2378b5d9 WatchSource:0}: Error finding container 545c49ed57b2c63b319287c8aee3257550d031eec4b961563a542daf2378b5d9: Status 404 returned error can't find the container with id 545c49ed57b2c63b319287c8aee3257550d031eec4b961563a542daf2378b5d9 Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.009486 4677 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-xcssh" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.056408 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/51dd4275-14c4-459b-a065-46ae2b4fd741-ca-trust-extracted\") pod \"image-registry-697d97f7c8-bldn4\" (UID: \"51dd4275-14c4-459b-a065-46ae2b4fd741\") " pod="openshift-image-registry/image-registry-697d97f7c8-bldn4" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.056741 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t9vbb\" (UniqueName: \"kubernetes.io/projected/51dd4275-14c4-459b-a065-46ae2b4fd741-kube-api-access-t9vbb\") pod \"image-registry-697d97f7c8-bldn4\" (UID: \"51dd4275-14c4-459b-a065-46ae2b4fd741\") " pod="openshift-image-registry/image-registry-697d97f7c8-bldn4" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.056775 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/51dd4275-14c4-459b-a065-46ae2b4fd741-registry-certificates\") pod \"image-registry-697d97f7c8-bldn4\" (UID: \"51dd4275-14c4-459b-a065-46ae2b4fd741\") " pod="openshift-image-registry/image-registry-697d97f7c8-bldn4" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.056800 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4d6f7\" (UniqueName: \"kubernetes.io/projected/f2c52f9a-2f5a-4bfe-9293-40e9b151fcd4-kube-api-access-4d6f7\") pod \"ingress-operator-5b745b69d9-9mpgg\" (UID: \"f2c52f9a-2f5a-4bfe-9293-40e9b151fcd4\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-9mpgg" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.056831 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bldn4\" (UID: \"51dd4275-14c4-459b-a065-46ae2b4fd741\") " pod="openshift-image-registry/image-registry-697d97f7c8-bldn4" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.056848 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f2c52f9a-2f5a-4bfe-9293-40e9b151fcd4-trusted-ca\") pod \"ingress-operator-5b745b69d9-9mpgg\" (UID: \"f2c52f9a-2f5a-4bfe-9293-40e9b151fcd4\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-9mpgg" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.056875 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/12fff5e2-3fc4-4418-b0b8-04929a968823-serving-cert\") pod \"openshift-config-operator-7777fb866f-nhjp9\" (UID: \"12fff5e2-3fc4-4418-b0b8-04929a968823\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-nhjp9" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.056905 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/51dd4275-14c4-459b-a065-46ae2b4fd741-bound-sa-token\") pod \"image-registry-697d97f7c8-bldn4\" (UID: \"51dd4275-14c4-459b-a065-46ae2b4fd741\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-bldn4" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.056926 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f2c52f9a-2f5a-4bfe-9293-40e9b151fcd4-metrics-tls\") pod \"ingress-operator-5b745b69d9-9mpgg\" (UID: \"f2c52f9a-2f5a-4bfe-9293-40e9b151fcd4\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-9mpgg" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.056953 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/51dd4275-14c4-459b-a065-46ae2b4fd741-installation-pull-secrets\") pod \"image-registry-697d97f7c8-bldn4\" (UID: \"51dd4275-14c4-459b-a065-46ae2b4fd741\") " pod="openshift-image-registry/image-registry-697d97f7c8-bldn4" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.056971 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f2c52f9a-2f5a-4bfe-9293-40e9b151fcd4-bound-sa-token\") pod \"ingress-operator-5b745b69d9-9mpgg\" (UID: \"f2c52f9a-2f5a-4bfe-9293-40e9b151fcd4\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-9mpgg" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.057004 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jf8n4\" (UniqueName: \"kubernetes.io/projected/12fff5e2-3fc4-4418-b0b8-04929a968823-kube-api-access-jf8n4\") pod \"openshift-config-operator-7777fb866f-nhjp9\" (UID: \"12fff5e2-3fc4-4418-b0b8-04929a968823\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-nhjp9" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.057038 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/12fff5e2-3fc4-4418-b0b8-04929a968823-available-featuregates\") pod \"openshift-config-operator-7777fb866f-nhjp9\" (UID: \"12fff5e2-3fc4-4418-b0b8-04929a968823\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-nhjp9" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.057068 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/51dd4275-14c4-459b-a065-46ae2b4fd741-registry-tls\") pod \"image-registry-697d97f7c8-bldn4\" (UID: \"51dd4275-14c4-459b-a065-46ae2b4fd741\") " pod="openshift-image-registry/image-registry-697d97f7c8-bldn4" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.057088 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/51dd4275-14c4-459b-a065-46ae2b4fd741-trusted-ca\") pod \"image-registry-697d97f7c8-bldn4\" (UID: \"51dd4275-14c4-459b-a065-46ae2b4fd741\") " pod="openshift-image-registry/image-registry-697d97f7c8-bldn4" Oct 07 13:09:31 crc kubenswrapper[4677]: E1007 13:09:31.058124 4677 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 13:09:31.558109336 +0000 UTC m=+143.043818451 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bldn4" (UID: "51dd4275-14c4-459b-a065-46ae2b4fd741") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.093937 4677 generic.go:334] "Generic (PLEG): container finished" podID="69136696-f636-4c23-b89a-bfbb2eba3a85" containerID="0cc04e0f09f3b01f5ccb7058c079e33ac767111d45f8dd07a93890bd5592bc02" exitCode=0 Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.094013 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5x69q" event={"ID":"69136696-f636-4c23-b89a-bfbb2eba3a85","Type":"ContainerDied","Data":"0cc04e0f09f3b01f5ccb7058c079e33ac767111d45f8dd07a93890bd5592bc02"} Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.094056 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5x69q" event={"ID":"69136696-f636-4c23-b89a-bfbb2eba3a85","Type":"ContainerStarted","Data":"6c4209918b6aadebbfba399afec4dc94fbb5b79bae5b5ddc9b79804830bb03dc"} Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.097188 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bv6vv" event={"ID":"e5784d1c-d283-408d-a435-66b7cda6ac32","Type":"ContainerStarted","Data":"e40b503c97567469891fad99864cd6d6ead92f36341053b9b651a92645ab8aca"} Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.097245 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bv6vv" event={"ID":"e5784d1c-d283-408d-a435-66b7cda6ac32","Type":"ContainerStarted","Data":"17ec27ca81584d61d9b179dda5b8b701d5177a789b59bcbc4aad7392eb01d82a"} Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.099753 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-zvfd4" event={"ID":"d96fbf2b-aad8-46ca-88a1-df1e9624f0ed","Type":"ContainerStarted","Data":"8d9b84d2b6b6987ff0884fc11281689af212c6a3f8eaf455d6a0961b200bacfb"} Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.099774 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-zvfd4" event={"ID":"d96fbf2b-aad8-46ca-88a1-df1e9624f0ed","Type":"ContainerStarted","Data":"f4d6a70290217e8624283adc83f187168dd4b52de286e79cb56504da6ab22222"} Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.099784 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-zvfd4" event={"ID":"d96fbf2b-aad8-46ca-88a1-df1e9624f0ed","Type":"ContainerStarted","Data":"b9e08f36c3a45e8e498bed1ea0ad850e3eeb74ad8feee2c00de7476d3ec4fd1f"} Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.101978 4677 generic.go:334] "Generic (PLEG): container finished" podID="a5919bd4-5d4e-4aa1-9b66-36460e7e24f3" containerID="ab854ec161379bbb515afda2d764134ef3d1ab657aa159f96d715e7085f83080" exitCode=0 Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.102022 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-bkhfv" 
event={"ID":"a5919bd4-5d4e-4aa1-9b66-36460e7e24f3","Type":"ContainerDied","Data":"ab854ec161379bbb515afda2d764134ef3d1ab657aa159f96d715e7085f83080"} Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.103182 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-n2kcb" event={"ID":"fb2aac2a-eb0d-423b-a574-550cfb36adca","Type":"ContainerStarted","Data":"3b8d908c00fc996f14e5ca13582c89cffe7ae9a27ba85543175d7b69cc855347"} Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.103200 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-n2kcb" event={"ID":"fb2aac2a-eb0d-423b-a574-550cfb36adca","Type":"ContainerStarted","Data":"621e705864d079689940854cb69a2d92cdb2f9559786bfb600495d502dff81d4"} Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.105503 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-qk4fp" event={"ID":"807933d0-9a58-4191-9bde-74a00551f72e","Type":"ContainerStarted","Data":"dfd09b2306f81fe19e582e11c9c4b3ce378d6088e025db2fee88df38a1cc3eca"} Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.105657 4677 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-qk4fp" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.109340 4677 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-qk4fp container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" start-of-body= Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.109424 4677 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-qk4fp" podUID="807933d0-9a58-4191-9bde-74a00551f72e" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.168373 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.168783 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/51dd4275-14c4-459b-a065-46ae2b4fd741-installation-pull-secrets\") pod \"image-registry-697d97f7c8-bldn4\" (UID: \"51dd4275-14c4-459b-a065-46ae2b4fd741\") " pod="openshift-image-registry/image-registry-697d97f7c8-bldn4" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.168818 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/72fcb258-1307-45d4-bbff-c55e81ab1df3-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-c4qbb\" (UID: \"72fcb258-1307-45d4-bbff-c55e81ab1df3\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-c4qbb" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.168855 4677 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c9834654-a39c-4991-bd6b-db7c9e37b2d3-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-gmgcc\" (UID: \"c9834654-a39c-4991-bd6b-db7c9e37b2d3\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gmgcc" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.168876 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c9834654-a39c-4991-bd6b-db7c9e37b2d3-config\") pod \"kube-controller-manager-operator-78b949d7b-gmgcc\" (UID: \"c9834654-a39c-4991-bd6b-db7c9e37b2d3\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gmgcc" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.168922 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/9122c8d7-acc8-4ed0-81b0-79ea36536943-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-r6nq5\" (UID: \"9122c8d7-acc8-4ed0-81b0-79ea36536943\") " pod="openshift-marketplace/marketplace-operator-79b997595-r6nq5" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.168980 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/12fff5e2-3fc4-4418-b0b8-04929a968823-available-featuregates\") pod \"openshift-config-operator-7777fb866f-nhjp9\" (UID: \"12fff5e2-3fc4-4418-b0b8-04929a968823\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-nhjp9" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.169003 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ps9jl\" (UniqueName: \"kubernetes.io/projected/669c03f6-55a4-4fee-b442-99cf9863678e-kube-api-access-ps9jl\") pod \"packageserver-d55dfcdfc-ctrht\" (UID: \"669c03f6-55a4-4fee-b442-99cf9863678e\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-ctrht" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.169064 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b84820c2-fd0b-4e52-801c-a70286d639de-config-volume\") pod \"collect-profiles-29330700-zvg2m\" (UID: \"b84820c2-fd0b-4e52-801c-a70286d639de\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330700-zvg2m" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.169087 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xn6pm\" (UniqueName: \"kubernetes.io/projected/9122c8d7-acc8-4ed0-81b0-79ea36536943-kube-api-access-xn6pm\") pod \"marketplace-operator-79b997595-r6nq5\" (UID: \"9122c8d7-acc8-4ed0-81b0-79ea36536943\") " pod="openshift-marketplace/marketplace-operator-79b997595-r6nq5" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.169114 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/51dd4275-14c4-459b-a065-46ae2b4fd741-registry-tls\") pod \"image-registry-697d97f7c8-bldn4\" (UID: \"51dd4275-14c4-459b-a065-46ae2b4fd741\") " pod="openshift-image-registry/image-registry-697d97f7c8-bldn4" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 
13:09:31.169163 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/61a4ee31-59ca-4626-b1de-4d70fb7d8789-certs\") pod \"machine-config-server-9dgw6\" (UID: \"61a4ee31-59ca-4626-b1de-4d70fb7d8789\") " pod="openshift-machine-config-operator/machine-config-server-9dgw6" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.169190 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-45ppz\" (UniqueName: \"kubernetes.io/projected/f3e33968-c99b-483d-8a53-e4d92fba4e12-kube-api-access-45ppz\") pod \"catalog-operator-68c6474976-dqz8j\" (UID: \"f3e33968-c99b-483d-8a53-e4d92fba4e12\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dqz8j" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.169269 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/6773777c-949c-46da-95b7-c6008e52b396-default-certificate\") pod \"router-default-5444994796-7pwvn\" (UID: \"6773777c-949c-46da-95b7-c6008e52b396\") " pod="openshift-ingress/router-default-5444994796-7pwvn" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.169294 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/214bd9b5-75be-4fc3-b973-1ca9d77431bf-signing-cabundle\") pod \"service-ca-9c57cc56f-rsbwp\" (UID: \"214bd9b5-75be-4fc3-b973-1ca9d77431bf\") " pod="openshift-service-ca/service-ca-9c57cc56f-rsbwp" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.169337 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/214bd9b5-75be-4fc3-b973-1ca9d77431bf-signing-key\") pod \"service-ca-9c57cc56f-rsbwp\" (UID: \"214bd9b5-75be-4fc3-b973-1ca9d77431bf\") " pod="openshift-service-ca/service-ca-9c57cc56f-rsbwp" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.169369 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c9dhb\" (UniqueName: \"kubernetes.io/projected/1548b62c-1671-430a-9286-a999460ae8d3-kube-api-access-c9dhb\") pod \"csi-hostpathplugin-wpdhg\" (UID: \"1548b62c-1671-430a-9286-a999460ae8d3\") " pod="hostpath-provisioner/csi-hostpathplugin-wpdhg" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.169397 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/4b5e8fd2-4cb0-4bfd-8ce1-17f58352a3d8-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-cn5r8\" (UID: \"4b5e8fd2-4cb0-4bfd-8ce1-17f58352a3d8\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-cn5r8" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.169466 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/51dd4275-14c4-459b-a065-46ae2b4fd741-ca-trust-extracted\") pod \"image-registry-697d97f7c8-bldn4\" (UID: \"51dd4275-14c4-459b-a065-46ae2b4fd741\") " pod="openshift-image-registry/image-registry-697d97f7c8-bldn4" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.169493 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/1ed24530-102d-45f3-9d9e-e74a7fefdd7e-proxy-tls\") pod \"machine-config-controller-84d6567774-b9mxw\" (UID: \"1ed24530-102d-45f3-9d9e-e74a7fefdd7e\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-b9mxw" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.169535 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9f6z9\" (UniqueName: \"kubernetes.io/projected/d891b0d9-0d88-4bd7-9f01-0f5a4991dc92-kube-api-access-9f6z9\") pod \"machine-config-operator-74547568cd-gsr6h\" (UID: \"d891b0d9-0d88-4bd7-9f01-0f5a4991dc92\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gsr6h" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.169558 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rl87m\" (UniqueName: \"kubernetes.io/projected/214bd9b5-75be-4fc3-b973-1ca9d77431bf-kube-api-access-rl87m\") pod \"service-ca-9c57cc56f-rsbwp\" (UID: \"214bd9b5-75be-4fc3-b973-1ca9d77431bf\") " pod="openshift-service-ca/service-ca-9c57cc56f-rsbwp" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.169579 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bg9m9\" (UniqueName: \"kubernetes.io/projected/5a685905-0b88-4b62-977b-a84e13aa85f7-kube-api-access-bg9m9\") pod \"service-ca-operator-777779d784-v42j9\" (UID: \"5a685905-0b88-4b62-977b-a84e13aa85f7\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-v42j9" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.169604 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/1548b62c-1671-430a-9286-a999460ae8d3-plugins-dir\") pod \"csi-hostpathplugin-wpdhg\" (UID: \"1548b62c-1671-430a-9286-a999460ae8d3\") " pod="hostpath-provisioner/csi-hostpathplugin-wpdhg" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.169645 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t9vbb\" (UniqueName: \"kubernetes.io/projected/51dd4275-14c4-459b-a065-46ae2b4fd741-kube-api-access-t9vbb\") pod \"image-registry-697d97f7c8-bldn4\" (UID: \"51dd4275-14c4-459b-a065-46ae2b4fd741\") " pod="openshift-image-registry/image-registry-697d97f7c8-bldn4" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.169668 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f3e33968-c99b-483d-8a53-e4d92fba4e12-srv-cert\") pod \"catalog-operator-68c6474976-dqz8j\" (UID: \"f3e33968-c99b-483d-8a53-e4d92fba4e12\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dqz8j" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.169686 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f3e33968-c99b-483d-8a53-e4d92fba4e12-profile-collector-cert\") pod \"catalog-operator-68c6474976-dqz8j\" (UID: \"f3e33968-c99b-483d-8a53-e4d92fba4e12\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dqz8j" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.169712 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/f5b08d09-cb5f-49e7-b4e3-d52b2e071ee5-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-jh976\" (UID: \"f5b08d09-cb5f-49e7-b4e3-d52b2e071ee5\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jh976" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.169846 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1ed24530-102d-45f3-9d9e-e74a7fefdd7e-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-b9mxw\" (UID: \"1ed24530-102d-45f3-9d9e-e74a7fefdd7e\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-b9mxw" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.169871 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/d8c1b6d9-29d0-4888-b8bf-3380aabe575f-srv-cert\") pod \"olm-operator-6b444d44fb-lprql\" (UID: \"d8c1b6d9-29d0-4888-b8bf-3380aabe575f\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lprql" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.169908 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d891b0d9-0d88-4bd7-9f01-0f5a4991dc92-proxy-tls\") pod \"machine-config-operator-74547568cd-gsr6h\" (UID: \"d891b0d9-0d88-4bd7-9f01-0f5a4991dc92\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gsr6h" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.169935 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/61a4ee31-59ca-4626-b1de-4d70fb7d8789-node-bootstrap-token\") pod \"machine-config-server-9dgw6\" (UID: \"61a4ee31-59ca-4626-b1de-4d70fb7d8789\") " pod="openshift-machine-config-operator/machine-config-server-9dgw6" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.169960 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rdzv5\" (UniqueName: \"kubernetes.io/projected/d8c1b6d9-29d0-4888-b8bf-3380aabe575f-kube-api-access-rdzv5\") pod \"olm-operator-6b444d44fb-lprql\" (UID: \"d8c1b6d9-29d0-4888-b8bf-3380aabe575f\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lprql" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.170017 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4d6f7\" (UniqueName: \"kubernetes.io/projected/f2c52f9a-2f5a-4bfe-9293-40e9b151fcd4-kube-api-access-4d6f7\") pod \"ingress-operator-5b745b69d9-9mpgg\" (UID: \"f2c52f9a-2f5a-4bfe-9293-40e9b151fcd4\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-9mpgg" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.170043 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3a01f67f-7cce-4bdc-8c89-10c4ac505f20-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-fd92z\" (UID: \"3a01f67f-7cce-4bdc-8c89-10c4ac505f20\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-fd92z" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.170074 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/669c03f6-55a4-4fee-b442-99cf9863678e-webhook-cert\") pod \"packageserver-d55dfcdfc-ctrht\" (UID: \"669c03f6-55a4-4fee-b442-99cf9863678e\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-ctrht" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.170110 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f2c52f9a-2f5a-4bfe-9293-40e9b151fcd4-trusted-ca\") pod \"ingress-operator-5b745b69d9-9mpgg\" (UID: \"f2c52f9a-2f5a-4bfe-9293-40e9b151fcd4\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-9mpgg" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.170129 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/2fef6870-fac6-49bc-8471-fb78198ba057-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-wklrf\" (UID: \"2fef6870-fac6-49bc-8471-fb78198ba057\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-wklrf" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.170517 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5a685905-0b88-4b62-977b-a84e13aa85f7-serving-cert\") pod \"service-ca-operator-777779d784-v42j9\" (UID: \"5a685905-0b88-4b62-977b-a84e13aa85f7\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-v42j9" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.170620 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/51dd4275-14c4-459b-a065-46ae2b4fd741-bound-sa-token\") pod \"image-registry-697d97f7c8-bldn4\" (UID: \"51dd4275-14c4-459b-a065-46ae2b4fd741\") " pod="openshift-image-registry/image-registry-697d97f7c8-bldn4" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.170638 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f2c52f9a-2f5a-4bfe-9293-40e9b151fcd4-metrics-tls\") pod \"ingress-operator-5b745b69d9-9mpgg\" (UID: \"f2c52f9a-2f5a-4bfe-9293-40e9b151fcd4\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-9mpgg" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.170726 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f2c52f9a-2f5a-4bfe-9293-40e9b151fcd4-bound-sa-token\") pod \"ingress-operator-5b745b69d9-9mpgg\" (UID: \"f2c52f9a-2f5a-4bfe-9293-40e9b151fcd4\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-9mpgg" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.170801 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jf8n4\" (UniqueName: \"kubernetes.io/projected/12fff5e2-3fc4-4418-b0b8-04929a968823-kube-api-access-jf8n4\") pod \"openshift-config-operator-7777fb866f-nhjp9\" (UID: \"12fff5e2-3fc4-4418-b0b8-04929a968823\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-nhjp9" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.170834 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/72fcb258-1307-45d4-bbff-c55e81ab1df3-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-c4qbb\" (UID: \"72fcb258-1307-45d4-bbff-c55e81ab1df3\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-c4qbb" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.170857 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/72fcb258-1307-45d4-bbff-c55e81ab1df3-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-c4qbb\" (UID: \"72fcb258-1307-45d4-bbff-c55e81ab1df3\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-c4qbb" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.170884 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6773777c-949c-46da-95b7-c6008e52b396-metrics-certs\") pod \"router-default-5444994796-7pwvn\" (UID: \"6773777c-949c-46da-95b7-c6008e52b396\") " pod="openshift-ingress/router-default-5444994796-7pwvn" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.170983 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/51dd4275-14c4-459b-a065-46ae2b4fd741-trusted-ca\") pod \"image-registry-697d97f7c8-bldn4\" (UID: \"51dd4275-14c4-459b-a065-46ae2b4fd741\") " pod="openshift-image-registry/image-registry-697d97f7c8-bldn4" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.171011 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7967c1f0-b5b5-4640-b3f0-38588f8af13c-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-s576r\" (UID: \"7967c1f0-b5b5-4640-b3f0-38588f8af13c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-s576r" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.171038 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9122c8d7-acc8-4ed0-81b0-79ea36536943-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-r6nq5\" (UID: \"9122c8d7-acc8-4ed0-81b0-79ea36536943\") " pod="openshift-marketplace/marketplace-operator-79b997595-r6nq5" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.171067 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/669c03f6-55a4-4fee-b442-99cf9863678e-tmpfs\") pod \"packageserver-d55dfcdfc-ctrht\" (UID: \"669c03f6-55a4-4fee-b442-99cf9863678e\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-ctrht" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.171097 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f0a169c9-3504-4d10-a04f-0c9223b5acca-config-volume\") pod \"dns-default-bt865\" (UID: \"f0a169c9-3504-4d10-a04f-0c9223b5acca\") " pod="openshift-dns/dns-default-bt865" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.171174 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qlrmg\" (UniqueName: 
\"kubernetes.io/projected/4b5e8fd2-4cb0-4bfd-8ce1-17f58352a3d8-kube-api-access-qlrmg\") pod \"multus-admission-controller-857f4d67dd-cn5r8\" (UID: \"4b5e8fd2-4cb0-4bfd-8ce1-17f58352a3d8\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-cn5r8" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.171202 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c9834654-a39c-4991-bd6b-db7c9e37b2d3-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-gmgcc\" (UID: \"c9834654-a39c-4991-bd6b-db7c9e37b2d3\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gmgcc" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.171237 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5a685905-0b88-4b62-977b-a84e13aa85f7-config\") pod \"service-ca-operator-777779d784-v42j9\" (UID: \"5a685905-0b88-4b62-977b-a84e13aa85f7\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-v42j9" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.171274 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w2fjn\" (UniqueName: \"kubernetes.io/projected/74a859c4-374b-4fc4-89c6-e2ae649fe43f-kube-api-access-w2fjn\") pod \"ingress-canary-sk5ff\" (UID: \"74a859c4-374b-4fc4-89c6-e2ae649fe43f\") " pod="openshift-ingress-canary/ingress-canary-sk5ff" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.171292 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zm7k5\" (UniqueName: \"kubernetes.io/projected/3a01f67f-7cce-4bdc-8c89-10c4ac505f20-kube-api-access-zm7k5\") pod \"package-server-manager-789f6589d5-fd92z\" (UID: \"3a01f67f-7cce-4bdc-8c89-10c4ac505f20\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-fd92z" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.171341 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/1548b62c-1671-430a-9286-a999460ae8d3-socket-dir\") pod \"csi-hostpathplugin-wpdhg\" (UID: \"1548b62c-1671-430a-9286-a999460ae8d3\") " pod="hostpath-provisioner/csi-hostpathplugin-wpdhg" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.171362 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jljbf\" (UniqueName: \"kubernetes.io/projected/b84820c2-fd0b-4e52-801c-a70286d639de-kube-api-access-jljbf\") pod \"collect-profiles-29330700-zvg2m\" (UID: \"b84820c2-fd0b-4e52-801c-a70286d639de\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330700-zvg2m" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.171379 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d891b0d9-0d88-4bd7-9f01-0f5a4991dc92-auth-proxy-config\") pod \"machine-config-operator-74547568cd-gsr6h\" (UID: \"d891b0d9-0d88-4bd7-9f01-0f5a4991dc92\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gsr6h" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.171399 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f5b08d09-cb5f-49e7-b4e3-d52b2e071ee5-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-jh976\" (UID: \"f5b08d09-cb5f-49e7-b4e3-d52b2e071ee5\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jh976" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.171417 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/1548b62c-1671-430a-9286-a999460ae8d3-mountpoint-dir\") pod \"csi-hostpathplugin-wpdhg\" (UID: \"1548b62c-1671-430a-9286-a999460ae8d3\") " pod="hostpath-provisioner/csi-hostpathplugin-wpdhg" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.171654 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bnqg9\" (UniqueName: \"kubernetes.io/projected/6773777c-949c-46da-95b7-c6008e52b396-kube-api-access-bnqg9\") pod \"router-default-5444994796-7pwvn\" (UID: \"6773777c-949c-46da-95b7-c6008e52b396\") " pod="openshift-ingress/router-default-5444994796-7pwvn" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.171676 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/d8c1b6d9-29d0-4888-b8bf-3380aabe575f-profile-collector-cert\") pod \"olm-operator-6b444d44fb-lprql\" (UID: \"d8c1b6d9-29d0-4888-b8bf-3380aabe575f\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lprql" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.171697 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b84820c2-fd0b-4e52-801c-a70286d639de-secret-volume\") pod \"collect-profiles-29330700-zvg2m\" (UID: \"b84820c2-fd0b-4e52-801c-a70286d639de\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330700-zvg2m" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.171745 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-48ptn\" (UniqueName: \"kubernetes.io/projected/2fef6870-fac6-49bc-8471-fb78198ba057-kube-api-access-48ptn\") pod \"control-plane-machine-set-operator-78cbb6b69f-wklrf\" (UID: \"2fef6870-fac6-49bc-8471-fb78198ba057\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-wklrf" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.171764 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/d891b0d9-0d88-4bd7-9f01-0f5a4991dc92-images\") pod \"machine-config-operator-74547568cd-gsr6h\" (UID: \"d891b0d9-0d88-4bd7-9f01-0f5a4991dc92\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gsr6h" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.171799 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/1548b62c-1671-430a-9286-a999460ae8d3-registration-dir\") pod \"csi-hostpathplugin-wpdhg\" (UID: \"1548b62c-1671-430a-9286-a999460ae8d3\") " pod="hostpath-provisioner/csi-hostpathplugin-wpdhg" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.171838 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: 
\"kubernetes.io/configmap/51dd4275-14c4-459b-a065-46ae2b4fd741-registry-certificates\") pod \"image-registry-697d97f7c8-bldn4\" (UID: \"51dd4275-14c4-459b-a065-46ae2b4fd741\") " pod="openshift-image-registry/image-registry-697d97f7c8-bldn4" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.171858 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f0a169c9-3504-4d10-a04f-0c9223b5acca-metrics-tls\") pod \"dns-default-bt865\" (UID: \"f0a169c9-3504-4d10-a04f-0c9223b5acca\") " pod="openshift-dns/dns-default-bt865" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.171880 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/74a859c4-374b-4fc4-89c6-e2ae649fe43f-cert\") pod \"ingress-canary-sk5ff\" (UID: \"74a859c4-374b-4fc4-89c6-e2ae649fe43f\") " pod="openshift-ingress-canary/ingress-canary-sk5ff" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.171933 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7967c1f0-b5b5-4640-b3f0-38588f8af13c-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-s576r\" (UID: \"7967c1f0-b5b5-4640-b3f0-38588f8af13c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-s576r" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.171954 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7224h\" (UniqueName: \"kubernetes.io/projected/1ed24530-102d-45f3-9d9e-e74a7fefdd7e-kube-api-access-7224h\") pod \"machine-config-controller-84d6567774-b9mxw\" (UID: \"1ed24530-102d-45f3-9d9e-e74a7fefdd7e\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-b9mxw" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.171975 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/669c03f6-55a4-4fee-b442-99cf9863678e-apiservice-cert\") pod \"packageserver-d55dfcdfc-ctrht\" (UID: \"669c03f6-55a4-4fee-b442-99cf9863678e\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-ctrht" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.172025 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/1548b62c-1671-430a-9286-a999460ae8d3-csi-data-dir\") pod \"csi-hostpathplugin-wpdhg\" (UID: \"1548b62c-1671-430a-9286-a999460ae8d3\") " pod="hostpath-provisioner/csi-hostpathplugin-wpdhg" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.172063 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f5b08d09-cb5f-49e7-b4e3-d52b2e071ee5-config\") pod \"kube-apiserver-operator-766d6c64bb-jh976\" (UID: \"f5b08d09-cb5f-49e7-b4e3-d52b2e071ee5\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jh976" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.172082 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/12fff5e2-3fc4-4418-b0b8-04929a968823-serving-cert\") pod \"openshift-config-operator-7777fb866f-nhjp9\" (UID: 
\"12fff5e2-3fc4-4418-b0b8-04929a968823\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-nhjp9" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.172104 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pzhv8\" (UniqueName: \"kubernetes.io/projected/f0a169c9-3504-4d10-a04f-0c9223b5acca-kube-api-access-pzhv8\") pod \"dns-default-bt865\" (UID: \"f0a169c9-3504-4d10-a04f-0c9223b5acca\") " pod="openshift-dns/dns-default-bt865" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.172125 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/6773777c-949c-46da-95b7-c6008e52b396-stats-auth\") pod \"router-default-5444994796-7pwvn\" (UID: \"6773777c-949c-46da-95b7-c6008e52b396\") " pod="openshift-ingress/router-default-5444994796-7pwvn" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.172144 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-94kw6\" (UniqueName: \"kubernetes.io/projected/61a4ee31-59ca-4626-b1de-4d70fb7d8789-kube-api-access-94kw6\") pod \"machine-config-server-9dgw6\" (UID: \"61a4ee31-59ca-4626-b1de-4d70fb7d8789\") " pod="openshift-machine-config-operator/machine-config-server-9dgw6" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.172163 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6773777c-949c-46da-95b7-c6008e52b396-service-ca-bundle\") pod \"router-default-5444994796-7pwvn\" (UID: \"6773777c-949c-46da-95b7-c6008e52b396\") " pod="openshift-ingress/router-default-5444994796-7pwvn" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.172182 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rtxh9\" (UniqueName: \"kubernetes.io/projected/56785c5a-ebb4-4921-ae74-5239c1e09cf5-kube-api-access-rtxh9\") pod \"migrator-59844c95c7-qgcjl\" (UID: \"56785c5a-ebb4-4921-ae74-5239c1e09cf5\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qgcjl" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.172220 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sp28n\" (UniqueName: \"kubernetes.io/projected/7967c1f0-b5b5-4640-b3f0-38588f8af13c-kube-api-access-sp28n\") pod \"kube-storage-version-migrator-operator-b67b599dd-s576r\" (UID: \"7967c1f0-b5b5-4640-b3f0-38588f8af13c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-s576r" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.191702 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/51dd4275-14c4-459b-a065-46ae2b4fd741-ca-trust-extracted\") pod \"image-registry-697d97f7c8-bldn4\" (UID: \"51dd4275-14c4-459b-a065-46ae2b4fd741\") " pod="openshift-image-registry/image-registry-697d97f7c8-bldn4" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.193257 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/51dd4275-14c4-459b-a065-46ae2b4fd741-trusted-ca\") pod \"image-registry-697d97f7c8-bldn4\" (UID: \"51dd4275-14c4-459b-a065-46ae2b4fd741\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-bldn4" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.192146 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/51dd4275-14c4-459b-a065-46ae2b4fd741-installation-pull-secrets\") pod \"image-registry-697d97f7c8-bldn4\" (UID: \"51dd4275-14c4-459b-a065-46ae2b4fd741\") " pod="openshift-image-registry/image-registry-697d97f7c8-bldn4" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.193770 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/51dd4275-14c4-459b-a065-46ae2b4fd741-registry-certificates\") pod \"image-registry-697d97f7c8-bldn4\" (UID: \"51dd4275-14c4-459b-a065-46ae2b4fd741\") " pod="openshift-image-registry/image-registry-697d97f7c8-bldn4" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.195323 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/12fff5e2-3fc4-4418-b0b8-04929a968823-available-featuregates\") pod \"openshift-config-operator-7777fb866f-nhjp9\" (UID: \"12fff5e2-3fc4-4418-b0b8-04929a968823\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-nhjp9" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.203030 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-gspst" event={"ID":"bd1c8146-fe00-4a53-a102-17cfc6ef045b","Type":"ContainerStarted","Data":"da5f756ed2ce40fda4eaff3fc908b803777660990da37cf7bb6c8460c2373e85"} Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.203089 4677 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-gspst" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.203104 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-gspst" event={"ID":"bd1c8146-fe00-4a53-a102-17cfc6ef045b","Type":"ContainerStarted","Data":"7936f89d94c94cac5451e640bb52c560dfb7ad0a9e476277b82d7070d637ad6a"} Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.203119 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wt49f"] Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.203509 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f2c52f9a-2f5a-4bfe-9293-40e9b151fcd4-trusted-ca\") pod \"ingress-operator-5b745b69d9-9mpgg\" (UID: \"f2c52f9a-2f5a-4bfe-9293-40e9b151fcd4\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-9mpgg" Oct 07 13:09:31 crc kubenswrapper[4677]: E1007 13:09:31.204212 4677 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 13:09:31.704188715 +0000 UTC m=+143.189897860 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.204376 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rfpfx" event={"ID":"d5ca9be8-efe6-40e8-9f22-75f3e1644622","Type":"ContainerStarted","Data":"0d554f3163dd5bc26cd3eb6d35f855cb3865e8157411ab777e4aefe05d75b98a"} Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.204408 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rfpfx" event={"ID":"d5ca9be8-efe6-40e8-9f22-75f3e1644622","Type":"ContainerStarted","Data":"2a6f7fafebf1b1b972b32ee45cc3cfd824d9f284e729b99b23b492b882ba639b"} Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.204927 4677 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rfpfx" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.204997 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/51dd4275-14c4-459b-a065-46ae2b4fd741-registry-tls\") pod \"image-registry-697d97f7c8-bldn4\" (UID: \"51dd4275-14c4-459b-a065-46ae2b4fd741\") " pod="openshift-image-registry/image-registry-697d97f7c8-bldn4" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.210920 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/12fff5e2-3fc4-4418-b0b8-04929a968823-serving-cert\") pod \"openshift-config-operator-7777fb866f-nhjp9\" (UID: \"12fff5e2-3fc4-4418-b0b8-04929a968823\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-nhjp9" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.214988 4677 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-gspst container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.25:6443/healthz\": dial tcp 10.217.0.25:6443: connect: connection refused" start-of-body= Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.215043 4677 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-gspst" podUID="bd1c8146-fe00-4a53-a102-17cfc6ef045b" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.25:6443/healthz\": dial tcp 10.217.0.25:6443: connect: connection refused" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.215684 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f2c52f9a-2f5a-4bfe-9293-40e9b151fcd4-metrics-tls\") pod \"ingress-operator-5b745b69d9-9mpgg\" (UID: \"f2c52f9a-2f5a-4bfe-9293-40e9b151fcd4\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-9mpgg" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.223990 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-b5qm4" 
event={"ID":"5172e7b5-3ef1-4f51-8874-8d4ac858284b","Type":"ContainerStarted","Data":"ebfd1298743890d99e47fe77d5d2abbd4d1a36ae0bd3b481dceb6997c6813292"} Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.224258 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-b5qm4" event={"ID":"5172e7b5-3ef1-4f51-8874-8d4ac858284b","Type":"ContainerStarted","Data":"054c11bb73c7c5320dddcdc374cb7de0fd6c3013bd161f467addc49d1a332ec9"} Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.235149 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-psk9k" event={"ID":"6edf078f-96f5-46e0-be5c-012e4799f320","Type":"ContainerStarted","Data":"545c49ed57b2c63b319287c8aee3257550d031eec4b961563a542daf2378b5d9"} Oct 07 13:09:31 crc kubenswrapper[4677]: W1007 13:09:31.235179 4677 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7fc2830b_0f4d_4b3b_89bc_6e589839077d.slice/crio-ac4dff86cb5d33059b21e43bfa77aef61b6f4cdf260c0987f883fed39d07bb83 WatchSource:0}: Error finding container ac4dff86cb5d33059b21e43bfa77aef61b6f4cdf260c0987f883fed39d07bb83: Status 404 returned error can't find the container with id ac4dff86cb5d33059b21e43bfa77aef61b6f4cdf260c0987f883fed39d07bb83 Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.239953 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/51dd4275-14c4-459b-a065-46ae2b4fd741-bound-sa-token\") pod \"image-registry-697d97f7c8-bldn4\" (UID: \"51dd4275-14c4-459b-a065-46ae2b4fd741\") " pod="openshift-image-registry/image-registry-697d97f7c8-bldn4" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.244583 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f2c52f9a-2f5a-4bfe-9293-40e9b151fcd4-bound-sa-token\") pod \"ingress-operator-5b745b69d9-9mpgg\" (UID: \"f2c52f9a-2f5a-4bfe-9293-40e9b151fcd4\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-9mpgg" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.272101 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jf8n4\" (UniqueName: \"kubernetes.io/projected/12fff5e2-3fc4-4418-b0b8-04929a968823-kube-api-access-jf8n4\") pod \"openshift-config-operator-7777fb866f-nhjp9\" (UID: \"12fff5e2-3fc4-4418-b0b8-04929a968823\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-nhjp9" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.272964 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f3e33968-c99b-483d-8a53-e4d92fba4e12-srv-cert\") pod \"catalog-operator-68c6474976-dqz8j\" (UID: \"f3e33968-c99b-483d-8a53-e4d92fba4e12\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dqz8j" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.272994 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f3e33968-c99b-483d-8a53-e4d92fba4e12-profile-collector-cert\") pod \"catalog-operator-68c6474976-dqz8j\" (UID: \"f3e33968-c99b-483d-8a53-e4d92fba4e12\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dqz8j" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.273011 4677 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rl87m\" (UniqueName: \"kubernetes.io/projected/214bd9b5-75be-4fc3-b973-1ca9d77431bf-kube-api-access-rl87m\") pod \"service-ca-9c57cc56f-rsbwp\" (UID: \"214bd9b5-75be-4fc3-b973-1ca9d77431bf\") " pod="openshift-service-ca/service-ca-9c57cc56f-rsbwp" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.273031 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bg9m9\" (UniqueName: \"kubernetes.io/projected/5a685905-0b88-4b62-977b-a84e13aa85f7-kube-api-access-bg9m9\") pod \"service-ca-operator-777779d784-v42j9\" (UID: \"5a685905-0b88-4b62-977b-a84e13aa85f7\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-v42j9" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.273051 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/1548b62c-1671-430a-9286-a999460ae8d3-plugins-dir\") pod \"csi-hostpathplugin-wpdhg\" (UID: \"1548b62c-1671-430a-9286-a999460ae8d3\") " pod="hostpath-provisioner/csi-hostpathplugin-wpdhg" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.273070 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f5b08d09-cb5f-49e7-b4e3-d52b2e071ee5-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-jh976\" (UID: \"f5b08d09-cb5f-49e7-b4e3-d52b2e071ee5\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jh976" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.273128 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1ed24530-102d-45f3-9d9e-e74a7fefdd7e-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-b9mxw\" (UID: \"1ed24530-102d-45f3-9d9e-e74a7fefdd7e\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-b9mxw" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.273142 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/d8c1b6d9-29d0-4888-b8bf-3380aabe575f-srv-cert\") pod \"olm-operator-6b444d44fb-lprql\" (UID: \"d8c1b6d9-29d0-4888-b8bf-3380aabe575f\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lprql" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.273194 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d891b0d9-0d88-4bd7-9f01-0f5a4991dc92-proxy-tls\") pod \"machine-config-operator-74547568cd-gsr6h\" (UID: \"d891b0d9-0d88-4bd7-9f01-0f5a4991dc92\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gsr6h" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.273212 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/61a4ee31-59ca-4626-b1de-4d70fb7d8789-node-bootstrap-token\") pod \"machine-config-server-9dgw6\" (UID: \"61a4ee31-59ca-4626-b1de-4d70fb7d8789\") " pod="openshift-machine-config-operator/machine-config-server-9dgw6" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.273227 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdzv5\" (UniqueName: 
\"kubernetes.io/projected/d8c1b6d9-29d0-4888-b8bf-3380aabe575f-kube-api-access-rdzv5\") pod \"olm-operator-6b444d44fb-lprql\" (UID: \"d8c1b6d9-29d0-4888-b8bf-3380aabe575f\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lprql" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.273307 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3a01f67f-7cce-4bdc-8c89-10c4ac505f20-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-fd92z\" (UID: \"3a01f67f-7cce-4bdc-8c89-10c4ac505f20\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-fd92z" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.273326 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/669c03f6-55a4-4fee-b442-99cf9863678e-webhook-cert\") pod \"packageserver-d55dfcdfc-ctrht\" (UID: \"669c03f6-55a4-4fee-b442-99cf9863678e\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-ctrht" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.273345 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bldn4\" (UID: \"51dd4275-14c4-459b-a065-46ae2b4fd741\") " pod="openshift-image-registry/image-registry-697d97f7c8-bldn4" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.273364 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/2fef6870-fac6-49bc-8471-fb78198ba057-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-wklrf\" (UID: \"2fef6870-fac6-49bc-8471-fb78198ba057\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-wklrf" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.273394 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5a685905-0b88-4b62-977b-a84e13aa85f7-serving-cert\") pod \"service-ca-operator-777779d784-v42j9\" (UID: \"5a685905-0b88-4b62-977b-a84e13aa85f7\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-v42j9" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.273468 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/72fcb258-1307-45d4-bbff-c55e81ab1df3-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-c4qbb\" (UID: \"72fcb258-1307-45d4-bbff-c55e81ab1df3\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-c4qbb" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.273489 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/72fcb258-1307-45d4-bbff-c55e81ab1df3-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-c4qbb\" (UID: \"72fcb258-1307-45d4-bbff-c55e81ab1df3\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-c4qbb" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.273515 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6773777c-949c-46da-95b7-c6008e52b396-metrics-certs\") pod \"router-default-5444994796-7pwvn\" (UID: \"6773777c-949c-46da-95b7-c6008e52b396\") " pod="openshift-ingress/router-default-5444994796-7pwvn" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.273555 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9122c8d7-acc8-4ed0-81b0-79ea36536943-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-r6nq5\" (UID: \"9122c8d7-acc8-4ed0-81b0-79ea36536943\") " pod="openshift-marketplace/marketplace-operator-79b997595-r6nq5" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.273576 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/669c03f6-55a4-4fee-b442-99cf9863678e-tmpfs\") pod \"packageserver-d55dfcdfc-ctrht\" (UID: \"669c03f6-55a4-4fee-b442-99cf9863678e\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-ctrht" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.273592 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7967c1f0-b5b5-4640-b3f0-38588f8af13c-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-s576r\" (UID: \"7967c1f0-b5b5-4640-b3f0-38588f8af13c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-s576r" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.273605 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f0a169c9-3504-4d10-a04f-0c9223b5acca-config-volume\") pod \"dns-default-bt865\" (UID: \"f0a169c9-3504-4d10-a04f-0c9223b5acca\") " pod="openshift-dns/dns-default-bt865" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.273622 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qlrmg\" (UniqueName: \"kubernetes.io/projected/4b5e8fd2-4cb0-4bfd-8ce1-17f58352a3d8-kube-api-access-qlrmg\") pod \"multus-admission-controller-857f4d67dd-cn5r8\" (UID: \"4b5e8fd2-4cb0-4bfd-8ce1-17f58352a3d8\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-cn5r8" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.273637 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c9834654-a39c-4991-bd6b-db7c9e37b2d3-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-gmgcc\" (UID: \"c9834654-a39c-4991-bd6b-db7c9e37b2d3\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gmgcc" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.273659 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w2fjn\" (UniqueName: \"kubernetes.io/projected/74a859c4-374b-4fc4-89c6-e2ae649fe43f-kube-api-access-w2fjn\") pod \"ingress-canary-sk5ff\" (UID: \"74a859c4-374b-4fc4-89c6-e2ae649fe43f\") " pod="openshift-ingress-canary/ingress-canary-sk5ff" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.273674 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zm7k5\" (UniqueName: \"kubernetes.io/projected/3a01f67f-7cce-4bdc-8c89-10c4ac505f20-kube-api-access-zm7k5\") pod 
\"package-server-manager-789f6589d5-fd92z\" (UID: \"3a01f67f-7cce-4bdc-8c89-10c4ac505f20\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-fd92z" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.273688 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5a685905-0b88-4b62-977b-a84e13aa85f7-config\") pod \"service-ca-operator-777779d784-v42j9\" (UID: \"5a685905-0b88-4b62-977b-a84e13aa85f7\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-v42j9" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.273711 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/1548b62c-1671-430a-9286-a999460ae8d3-socket-dir\") pod \"csi-hostpathplugin-wpdhg\" (UID: \"1548b62c-1671-430a-9286-a999460ae8d3\") " pod="hostpath-provisioner/csi-hostpathplugin-wpdhg" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.273727 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jljbf\" (UniqueName: \"kubernetes.io/projected/b84820c2-fd0b-4e52-801c-a70286d639de-kube-api-access-jljbf\") pod \"collect-profiles-29330700-zvg2m\" (UID: \"b84820c2-fd0b-4e52-801c-a70286d639de\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330700-zvg2m" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.273743 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/1548b62c-1671-430a-9286-a999460ae8d3-mountpoint-dir\") pod \"csi-hostpathplugin-wpdhg\" (UID: \"1548b62c-1671-430a-9286-a999460ae8d3\") " pod="hostpath-provisioner/csi-hostpathplugin-wpdhg" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.273757 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d891b0d9-0d88-4bd7-9f01-0f5a4991dc92-auth-proxy-config\") pod \"machine-config-operator-74547568cd-gsr6h\" (UID: \"d891b0d9-0d88-4bd7-9f01-0f5a4991dc92\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gsr6h" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.273777 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f5b08d09-cb5f-49e7-b4e3-d52b2e071ee5-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-jh976\" (UID: \"f5b08d09-cb5f-49e7-b4e3-d52b2e071ee5\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jh976" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.273792 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bnqg9\" (UniqueName: \"kubernetes.io/projected/6773777c-949c-46da-95b7-c6008e52b396-kube-api-access-bnqg9\") pod \"router-default-5444994796-7pwvn\" (UID: \"6773777c-949c-46da-95b7-c6008e52b396\") " pod="openshift-ingress/router-default-5444994796-7pwvn" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.273808 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-48ptn\" (UniqueName: \"kubernetes.io/projected/2fef6870-fac6-49bc-8471-fb78198ba057-kube-api-access-48ptn\") pod \"control-plane-machine-set-operator-78cbb6b69f-wklrf\" (UID: \"2fef6870-fac6-49bc-8471-fb78198ba057\") " 
pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-wklrf" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.273822 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/d891b0d9-0d88-4bd7-9f01-0f5a4991dc92-images\") pod \"machine-config-operator-74547568cd-gsr6h\" (UID: \"d891b0d9-0d88-4bd7-9f01-0f5a4991dc92\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gsr6h" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.273836 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/d8c1b6d9-29d0-4888-b8bf-3380aabe575f-profile-collector-cert\") pod \"olm-operator-6b444d44fb-lprql\" (UID: \"d8c1b6d9-29d0-4888-b8bf-3380aabe575f\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lprql" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.273852 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b84820c2-fd0b-4e52-801c-a70286d639de-secret-volume\") pod \"collect-profiles-29330700-zvg2m\" (UID: \"b84820c2-fd0b-4e52-801c-a70286d639de\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330700-zvg2m" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.273875 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/1548b62c-1671-430a-9286-a999460ae8d3-registration-dir\") pod \"csi-hostpathplugin-wpdhg\" (UID: \"1548b62c-1671-430a-9286-a999460ae8d3\") " pod="hostpath-provisioner/csi-hostpathplugin-wpdhg" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.273902 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7967c1f0-b5b5-4640-b3f0-38588f8af13c-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-s576r\" (UID: \"7967c1f0-b5b5-4640-b3f0-38588f8af13c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-s576r" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.273918 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7224h\" (UniqueName: \"kubernetes.io/projected/1ed24530-102d-45f3-9d9e-e74a7fefdd7e-kube-api-access-7224h\") pod \"machine-config-controller-84d6567774-b9mxw\" (UID: \"1ed24530-102d-45f3-9d9e-e74a7fefdd7e\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-b9mxw" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.273932 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f0a169c9-3504-4d10-a04f-0c9223b5acca-metrics-tls\") pod \"dns-default-bt865\" (UID: \"f0a169c9-3504-4d10-a04f-0c9223b5acca\") " pod="openshift-dns/dns-default-bt865" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.273946 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/74a859c4-374b-4fc4-89c6-e2ae649fe43f-cert\") pod \"ingress-canary-sk5ff\" (UID: \"74a859c4-374b-4fc4-89c6-e2ae649fe43f\") " pod="openshift-ingress-canary/ingress-canary-sk5ff" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.273968 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" 
(UniqueName: \"kubernetes.io/secret/669c03f6-55a4-4fee-b442-99cf9863678e-apiservice-cert\") pod \"packageserver-d55dfcdfc-ctrht\" (UID: \"669c03f6-55a4-4fee-b442-99cf9863678e\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-ctrht" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.274010 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/1548b62c-1671-430a-9286-a999460ae8d3-csi-data-dir\") pod \"csi-hostpathplugin-wpdhg\" (UID: \"1548b62c-1671-430a-9286-a999460ae8d3\") " pod="hostpath-provisioner/csi-hostpathplugin-wpdhg" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.274026 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f5b08d09-cb5f-49e7-b4e3-d52b2e071ee5-config\") pod \"kube-apiserver-operator-766d6c64bb-jh976\" (UID: \"f5b08d09-cb5f-49e7-b4e3-d52b2e071ee5\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jh976" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.274042 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/6773777c-949c-46da-95b7-c6008e52b396-stats-auth\") pod \"router-default-5444994796-7pwvn\" (UID: \"6773777c-949c-46da-95b7-c6008e52b396\") " pod="openshift-ingress/router-default-5444994796-7pwvn" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.274057 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pzhv8\" (UniqueName: \"kubernetes.io/projected/f0a169c9-3504-4d10-a04f-0c9223b5acca-kube-api-access-pzhv8\") pod \"dns-default-bt865\" (UID: \"f0a169c9-3504-4d10-a04f-0c9223b5acca\") " pod="openshift-dns/dns-default-bt865" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.274095 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-94kw6\" (UniqueName: \"kubernetes.io/projected/61a4ee31-59ca-4626-b1de-4d70fb7d8789-kube-api-access-94kw6\") pod \"machine-config-server-9dgw6\" (UID: \"61a4ee31-59ca-4626-b1de-4d70fb7d8789\") " pod="openshift-machine-config-operator/machine-config-server-9dgw6" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.274140 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6773777c-949c-46da-95b7-c6008e52b396-service-ca-bundle\") pod \"router-default-5444994796-7pwvn\" (UID: \"6773777c-949c-46da-95b7-c6008e52b396\") " pod="openshift-ingress/router-default-5444994796-7pwvn" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.274170 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sp28n\" (UniqueName: \"kubernetes.io/projected/7967c1f0-b5b5-4640-b3f0-38588f8af13c-kube-api-access-sp28n\") pod \"kube-storage-version-migrator-operator-b67b599dd-s576r\" (UID: \"7967c1f0-b5b5-4640-b3f0-38588f8af13c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-s576r" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.274192 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rtxh9\" (UniqueName: \"kubernetes.io/projected/56785c5a-ebb4-4921-ae74-5239c1e09cf5-kube-api-access-rtxh9\") pod \"migrator-59844c95c7-qgcjl\" (UID: \"56785c5a-ebb4-4921-ae74-5239c1e09cf5\") " 
pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qgcjl" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.274231 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/72fcb258-1307-45d4-bbff-c55e81ab1df3-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-c4qbb\" (UID: \"72fcb258-1307-45d4-bbff-c55e81ab1df3\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-c4qbb" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.274252 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c9834654-a39c-4991-bd6b-db7c9e37b2d3-config\") pod \"kube-controller-manager-operator-78b949d7b-gmgcc\" (UID: \"c9834654-a39c-4991-bd6b-db7c9e37b2d3\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gmgcc" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.274272 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c9834654-a39c-4991-bd6b-db7c9e37b2d3-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-gmgcc\" (UID: \"c9834654-a39c-4991-bd6b-db7c9e37b2d3\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gmgcc" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.274290 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/9122c8d7-acc8-4ed0-81b0-79ea36536943-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-r6nq5\" (UID: \"9122c8d7-acc8-4ed0-81b0-79ea36536943\") " pod="openshift-marketplace/marketplace-operator-79b997595-r6nq5" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.274315 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ps9jl\" (UniqueName: \"kubernetes.io/projected/669c03f6-55a4-4fee-b442-99cf9863678e-kube-api-access-ps9jl\") pod \"packageserver-d55dfcdfc-ctrht\" (UID: \"669c03f6-55a4-4fee-b442-99cf9863678e\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-ctrht" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.274369 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b84820c2-fd0b-4e52-801c-a70286d639de-config-volume\") pod \"collect-profiles-29330700-zvg2m\" (UID: \"b84820c2-fd0b-4e52-801c-a70286d639de\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330700-zvg2m" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.274388 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xn6pm\" (UniqueName: \"kubernetes.io/projected/9122c8d7-acc8-4ed0-81b0-79ea36536943-kube-api-access-xn6pm\") pod \"marketplace-operator-79b997595-r6nq5\" (UID: \"9122c8d7-acc8-4ed0-81b0-79ea36536943\") " pod="openshift-marketplace/marketplace-operator-79b997595-r6nq5" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.274413 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/61a4ee31-59ca-4626-b1de-4d70fb7d8789-certs\") pod \"machine-config-server-9dgw6\" (UID: \"61a4ee31-59ca-4626-b1de-4d70fb7d8789\") " pod="openshift-machine-config-operator/machine-config-server-9dgw6" Oct 07 13:09:31 
crc kubenswrapper[4677]: I1007 13:09:31.274502 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-45ppz\" (UniqueName: \"kubernetes.io/projected/f3e33968-c99b-483d-8a53-e4d92fba4e12-kube-api-access-45ppz\") pod \"catalog-operator-68c6474976-dqz8j\" (UID: \"f3e33968-c99b-483d-8a53-e4d92fba4e12\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dqz8j" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.274521 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/6773777c-949c-46da-95b7-c6008e52b396-default-certificate\") pod \"router-default-5444994796-7pwvn\" (UID: \"6773777c-949c-46da-95b7-c6008e52b396\") " pod="openshift-ingress/router-default-5444994796-7pwvn" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.274537 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/214bd9b5-75be-4fc3-b973-1ca9d77431bf-signing-key\") pod \"service-ca-9c57cc56f-rsbwp\" (UID: \"214bd9b5-75be-4fc3-b973-1ca9d77431bf\") " pod="openshift-service-ca/service-ca-9c57cc56f-rsbwp" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.274553 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/214bd9b5-75be-4fc3-b973-1ca9d77431bf-signing-cabundle\") pod \"service-ca-9c57cc56f-rsbwp\" (UID: \"214bd9b5-75be-4fc3-b973-1ca9d77431bf\") " pod="openshift-service-ca/service-ca-9c57cc56f-rsbwp" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.274578 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c9dhb\" (UniqueName: \"kubernetes.io/projected/1548b62c-1671-430a-9286-a999460ae8d3-kube-api-access-c9dhb\") pod \"csi-hostpathplugin-wpdhg\" (UID: \"1548b62c-1671-430a-9286-a999460ae8d3\") " pod="hostpath-provisioner/csi-hostpathplugin-wpdhg" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.274595 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/4b5e8fd2-4cb0-4bfd-8ce1-17f58352a3d8-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-cn5r8\" (UID: \"4b5e8fd2-4cb0-4bfd-8ce1-17f58352a3d8\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-cn5r8" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.274619 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9f6z9\" (UniqueName: \"kubernetes.io/projected/d891b0d9-0d88-4bd7-9f01-0f5a4991dc92-kube-api-access-9f6z9\") pod \"machine-config-operator-74547568cd-gsr6h\" (UID: \"d891b0d9-0d88-4bd7-9f01-0f5a4991dc92\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gsr6h" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.274636 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1ed24530-102d-45f3-9d9e-e74a7fefdd7e-proxy-tls\") pod \"machine-config-controller-84d6567774-b9mxw\" (UID: \"1ed24530-102d-45f3-9d9e-e74a7fefdd7e\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-b9mxw" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.283173 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/5a685905-0b88-4b62-977b-a84e13aa85f7-serving-cert\") pod \"service-ca-operator-777779d784-v42j9\" (UID: \"5a685905-0b88-4b62-977b-a84e13aa85f7\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-v42j9" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.284655 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/1548b62c-1671-430a-9286-a999460ae8d3-csi-data-dir\") pod \"csi-hostpathplugin-wpdhg\" (UID: \"1548b62c-1671-430a-9286-a999460ae8d3\") " pod="hostpath-provisioner/csi-hostpathplugin-wpdhg" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.285678 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/1548b62c-1671-430a-9286-a999460ae8d3-plugins-dir\") pod \"csi-hostpathplugin-wpdhg\" (UID: \"1548b62c-1671-430a-9286-a999460ae8d3\") " pod="hostpath-provisioner/csi-hostpathplugin-wpdhg" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.285714 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f5b08d09-cb5f-49e7-b4e3-d52b2e071ee5-config\") pod \"kube-apiserver-operator-766d6c64bb-jh976\" (UID: \"f5b08d09-cb5f-49e7-b4e3-d52b2e071ee5\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jh976" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.290509 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1ed24530-102d-45f3-9d9e-e74a7fefdd7e-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-b9mxw\" (UID: \"1ed24530-102d-45f3-9d9e-e74a7fefdd7e\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-b9mxw" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.290653 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t9vbb\" (UniqueName: \"kubernetes.io/projected/51dd4275-14c4-459b-a065-46ae2b4fd741-kube-api-access-t9vbb\") pod \"image-registry-697d97f7c8-bldn4\" (UID: \"51dd4275-14c4-459b-a065-46ae2b4fd741\") " pod="openshift-image-registry/image-registry-697d97f7c8-bldn4" Oct 07 13:09:31 crc kubenswrapper[4677]: E1007 13:09:31.291278 4677 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 13:09:31.791263889 +0000 UTC m=+143.276973004 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bldn4" (UID: "51dd4275-14c4-459b-a065-46ae2b4fd741") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.292171 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6773777c-949c-46da-95b7-c6008e52b396-metrics-certs\") pod \"router-default-5444994796-7pwvn\" (UID: \"6773777c-949c-46da-95b7-c6008e52b396\") " pod="openshift-ingress/router-default-5444994796-7pwvn" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.292754 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/669c03f6-55a4-4fee-b442-99cf9863678e-tmpfs\") pod \"packageserver-d55dfcdfc-ctrht\" (UID: \"669c03f6-55a4-4fee-b442-99cf9863678e\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-ctrht" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.293586 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/72fcb258-1307-45d4-bbff-c55e81ab1df3-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-c4qbb\" (UID: \"72fcb258-1307-45d4-bbff-c55e81ab1df3\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-c4qbb" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.293930 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9122c8d7-acc8-4ed0-81b0-79ea36536943-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-r6nq5\" (UID: \"9122c8d7-acc8-4ed0-81b0-79ea36536943\") " pod="openshift-marketplace/marketplace-operator-79b997595-r6nq5" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.294650 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/1548b62c-1671-430a-9286-a999460ae8d3-mountpoint-dir\") pod \"csi-hostpathplugin-wpdhg\" (UID: \"1548b62c-1671-430a-9286-a999460ae8d3\") " pod="hostpath-provisioner/csi-hostpathplugin-wpdhg" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.295571 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7967c1f0-b5b5-4640-b3f0-38588f8af13c-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-s576r\" (UID: \"7967c1f0-b5b5-4640-b3f0-38588f8af13c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-s576r" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.296216 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/d891b0d9-0d88-4bd7-9f01-0f5a4991dc92-images\") pod \"machine-config-operator-74547568cd-gsr6h\" (UID: \"d891b0d9-0d88-4bd7-9f01-0f5a4991dc92\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gsr6h" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.296521 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: 
\"kubernetes.io/configmap/214bd9b5-75be-4fc3-b973-1ca9d77431bf-signing-cabundle\") pod \"service-ca-9c57cc56f-rsbwp\" (UID: \"214bd9b5-75be-4fc3-b973-1ca9d77431bf\") " pod="openshift-service-ca/service-ca-9c57cc56f-rsbwp" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.305039 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6773777c-949c-46da-95b7-c6008e52b396-service-ca-bundle\") pod \"router-default-5444994796-7pwvn\" (UID: \"6773777c-949c-46da-95b7-c6008e52b396\") " pod="openshift-ingress/router-default-5444994796-7pwvn" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.306508 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/1548b62c-1671-430a-9286-a999460ae8d3-registration-dir\") pod \"csi-hostpathplugin-wpdhg\" (UID: \"1548b62c-1671-430a-9286-a999460ae8d3\") " pod="hostpath-provisioner/csi-hostpathplugin-wpdhg" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.307149 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c9834654-a39c-4991-bd6b-db7c9e37b2d3-config\") pod \"kube-controller-manager-operator-78b949d7b-gmgcc\" (UID: \"c9834654-a39c-4991-bd6b-db7c9e37b2d3\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gmgcc" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.310413 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/1548b62c-1671-430a-9286-a999460ae8d3-socket-dir\") pod \"csi-hostpathplugin-wpdhg\" (UID: \"1548b62c-1671-430a-9286-a999460ae8d3\") " pod="hostpath-provisioner/csi-hostpathplugin-wpdhg" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.311188 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d891b0d9-0d88-4bd7-9f01-0f5a4991dc92-proxy-tls\") pod \"machine-config-operator-74547568cd-gsr6h\" (UID: \"d891b0d9-0d88-4bd7-9f01-0f5a4991dc92\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gsr6h" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.313442 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d891b0d9-0d88-4bd7-9f01-0f5a4991dc92-auth-proxy-config\") pod \"machine-config-operator-74547568cd-gsr6h\" (UID: \"d891b0d9-0d88-4bd7-9f01-0f5a4991dc92\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gsr6h" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.314083 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b84820c2-fd0b-4e52-801c-a70286d639de-config-volume\") pod \"collect-profiles-29330700-zvg2m\" (UID: \"b84820c2-fd0b-4e52-801c-a70286d639de\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330700-zvg2m" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.315269 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1ed24530-102d-45f3-9d9e-e74a7fefdd7e-proxy-tls\") pod \"machine-config-controller-84d6567774-b9mxw\" (UID: \"1ed24530-102d-45f3-9d9e-e74a7fefdd7e\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-b9mxw" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 
13:09:31.316031 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f0a169c9-3504-4d10-a04f-0c9223b5acca-config-volume\") pod \"dns-default-bt865\" (UID: \"f0a169c9-3504-4d10-a04f-0c9223b5acca\") " pod="openshift-dns/dns-default-bt865" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.319032 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/9122c8d7-acc8-4ed0-81b0-79ea36536943-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-r6nq5\" (UID: \"9122c8d7-acc8-4ed0-81b0-79ea36536943\") " pod="openshift-marketplace/marketplace-operator-79b997595-r6nq5" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.325669 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/d8c1b6d9-29d0-4888-b8bf-3380aabe575f-srv-cert\") pod \"olm-operator-6b444d44fb-lprql\" (UID: \"d8c1b6d9-29d0-4888-b8bf-3380aabe575f\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lprql" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.325721 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f3e33968-c99b-483d-8a53-e4d92fba4e12-srv-cert\") pod \"catalog-operator-68c6474976-dqz8j\" (UID: \"f3e33968-c99b-483d-8a53-e4d92fba4e12\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dqz8j" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.325864 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c9834654-a39c-4991-bd6b-db7c9e37b2d3-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-gmgcc\" (UID: \"c9834654-a39c-4991-bd6b-db7c9e37b2d3\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gmgcc" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.326109 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3a01f67f-7cce-4bdc-8c89-10c4ac505f20-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-fd92z\" (UID: \"3a01f67f-7cce-4bdc-8c89-10c4ac505f20\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-fd92z" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.326118 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f5b08d09-cb5f-49e7-b4e3-d52b2e071ee5-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-jh976\" (UID: \"f5b08d09-cb5f-49e7-b4e3-d52b2e071ee5\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jh976" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.326182 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f3e33968-c99b-483d-8a53-e4d92fba4e12-profile-collector-cert\") pod \"catalog-operator-68c6474976-dqz8j\" (UID: \"f3e33968-c99b-483d-8a53-e4d92fba4e12\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dqz8j" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.326466 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/72fcb258-1307-45d4-bbff-c55e81ab1df3-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-c4qbb\" (UID: \"72fcb258-1307-45d4-bbff-c55e81ab1df3\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-c4qbb" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.326602 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/d8c1b6d9-29d0-4888-b8bf-3380aabe575f-profile-collector-cert\") pod \"olm-operator-6b444d44fb-lprql\" (UID: \"d8c1b6d9-29d0-4888-b8bf-3380aabe575f\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lprql" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.326727 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7967c1f0-b5b5-4640-b3f0-38588f8af13c-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-s576r\" (UID: \"7967c1f0-b5b5-4640-b3f0-38588f8af13c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-s576r" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.326834 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5a685905-0b88-4b62-977b-a84e13aa85f7-config\") pod \"service-ca-operator-777779d784-v42j9\" (UID: \"5a685905-0b88-4b62-977b-a84e13aa85f7\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-v42j9" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.326862 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/61a4ee31-59ca-4626-b1de-4d70fb7d8789-certs\") pod \"machine-config-server-9dgw6\" (UID: \"61a4ee31-59ca-4626-b1de-4d70fb7d8789\") " pod="openshift-machine-config-operator/machine-config-server-9dgw6" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.327007 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/74a859c4-374b-4fc4-89c6-e2ae649fe43f-cert\") pod \"ingress-canary-sk5ff\" (UID: \"74a859c4-374b-4fc4-89c6-e2ae649fe43f\") " pod="openshift-ingress-canary/ingress-canary-sk5ff" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.327478 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b84820c2-fd0b-4e52-801c-a70286d639de-secret-volume\") pod \"collect-profiles-29330700-zvg2m\" (UID: \"b84820c2-fd0b-4e52-801c-a70286d639de\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330700-zvg2m" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.327523 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4d6f7\" (UniqueName: \"kubernetes.io/projected/f2c52f9a-2f5a-4bfe-9293-40e9b151fcd4-kube-api-access-4d6f7\") pod \"ingress-operator-5b745b69d9-9mpgg\" (UID: \"f2c52f9a-2f5a-4bfe-9293-40e9b151fcd4\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-9mpgg" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.327849 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f0a169c9-3504-4d10-a04f-0c9223b5acca-metrics-tls\") pod \"dns-default-bt865\" (UID: \"f0a169c9-3504-4d10-a04f-0c9223b5acca\") " pod="openshift-dns/dns-default-bt865" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.328496 
4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/4b5e8fd2-4cb0-4bfd-8ce1-17f58352a3d8-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-cn5r8\" (UID: \"4b5e8fd2-4cb0-4bfd-8ce1-17f58352a3d8\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-cn5r8" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.328857 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/6773777c-949c-46da-95b7-c6008e52b396-stats-auth\") pod \"router-default-5444994796-7pwvn\" (UID: \"6773777c-949c-46da-95b7-c6008e52b396\") " pod="openshift-ingress/router-default-5444994796-7pwvn" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.330960 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/214bd9b5-75be-4fc3-b973-1ca9d77431bf-signing-key\") pod \"service-ca-9c57cc56f-rsbwp\" (UID: \"214bd9b5-75be-4fc3-b973-1ca9d77431bf\") " pod="openshift-service-ca/service-ca-9c57cc56f-rsbwp" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.331000 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/2fef6870-fac6-49bc-8471-fb78198ba057-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-wklrf\" (UID: \"2fef6870-fac6-49bc-8471-fb78198ba057\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-wklrf" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.331271 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/61a4ee31-59ca-4626-b1de-4d70fb7d8789-node-bootstrap-token\") pod \"machine-config-server-9dgw6\" (UID: \"61a4ee31-59ca-4626-b1de-4d70fb7d8789\") " pod="openshift-machine-config-operator/machine-config-server-9dgw6" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.333455 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/6773777c-949c-46da-95b7-c6008e52b396-default-certificate\") pod \"router-default-5444994796-7pwvn\" (UID: \"6773777c-949c-46da-95b7-c6008e52b396\") " pod="openshift-ingress/router-default-5444994796-7pwvn" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.339976 4677 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rfpfx" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.341419 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/669c03f6-55a4-4fee-b442-99cf9863678e-apiservice-cert\") pod \"packageserver-d55dfcdfc-ctrht\" (UID: \"669c03f6-55a4-4fee-b442-99cf9863678e\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-ctrht" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.342594 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/669c03f6-55a4-4fee-b442-99cf9863678e-webhook-cert\") pod \"packageserver-d55dfcdfc-ctrht\" (UID: \"669c03f6-55a4-4fee-b442-99cf9863678e\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-ctrht" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.348226 4677 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/72fcb258-1307-45d4-bbff-c55e81ab1df3-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-c4qbb\" (UID: \"72fcb258-1307-45d4-bbff-c55e81ab1df3\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-c4qbb" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.378169 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.378255 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rl87m\" (UniqueName: \"kubernetes.io/projected/214bd9b5-75be-4fc3-b973-1ca9d77431bf-kube-api-access-rl87m\") pod \"service-ca-9c57cc56f-rsbwp\" (UID: \"214bd9b5-75be-4fc3-b973-1ca9d77431bf\") " pod="openshift-service-ca/service-ca-9c57cc56f-rsbwp" Oct 07 13:09:31 crc kubenswrapper[4677]: E1007 13:09:31.378883 4677 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 13:09:31.878849342 +0000 UTC m=+143.364558467 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.379584 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bldn4\" (UID: \"51dd4275-14c4-459b-a065-46ae2b4fd741\") " pod="openshift-image-registry/image-registry-697d97f7c8-bldn4" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.379804 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-76928"] Oct 07 13:09:31 crc kubenswrapper[4677]: E1007 13:09:31.380784 4677 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 13:09:31.880763373 +0000 UTC m=+143.366472488 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bldn4" (UID: "51dd4275-14c4-459b-a065-46ae2b4fd741") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.385444 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bg9m9\" (UniqueName: \"kubernetes.io/projected/5a685905-0b88-4b62-977b-a84e13aa85f7-kube-api-access-bg9m9\") pod \"service-ca-operator-777779d784-v42j9\" (UID: \"5a685905-0b88-4b62-977b-a84e13aa85f7\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-v42j9" Oct 07 13:09:31 crc kubenswrapper[4677]: W1007 13:09:31.392048 4677 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod364ed7ee_3c5a_4d7f_ba97_ddd52483de83.slice/crio-702551c4e7a382aa334f3af67b303054c101621b135c9e174732031b2ed4da1e WatchSource:0}: Error finding container 702551c4e7a382aa334f3af67b303054c101621b135c9e174732031b2ed4da1e: Status 404 returned error can't find the container with id 702551c4e7a382aa334f3af67b303054c101621b135c9e174732031b2ed4da1e Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.409269 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdzv5\" (UniqueName: \"kubernetes.io/projected/d8c1b6d9-29d0-4888-b8bf-3380aabe575f-kube-api-access-rdzv5\") pod \"olm-operator-6b444d44fb-lprql\" (UID: \"d8c1b6d9-29d0-4888-b8bf-3380aabe575f\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lprql" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.423545 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xn6pm\" (UniqueName: \"kubernetes.io/projected/9122c8d7-acc8-4ed0-81b0-79ea36536943-kube-api-access-xn6pm\") pod \"marketplace-operator-79b997595-r6nq5\" (UID: \"9122c8d7-acc8-4ed0-81b0-79ea36536943\") " pod="openshift-marketplace/marketplace-operator-79b997595-r6nq5" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.432408 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-rsbwp" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.449887 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pzhv8\" (UniqueName: \"kubernetes.io/projected/f0a169c9-3504-4d10-a04f-0c9223b5acca-kube-api-access-pzhv8\") pod \"dns-default-bt865\" (UID: \"f0a169c9-3504-4d10-a04f-0c9223b5acca\") " pod="openshift-dns/dns-default-bt865" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.456485 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-v42j9" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.459874 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-mwgkm"] Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.478887 4677 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-bt865" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.480114 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 13:09:31 crc kubenswrapper[4677]: E1007 13:09:31.480591 4677 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 13:09:31.980575764 +0000 UTC m=+143.466284879 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.482626 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-94kw6\" (UniqueName: \"kubernetes.io/projected/61a4ee31-59ca-4626-b1de-4d70fb7d8789-kube-api-access-94kw6\") pod \"machine-config-server-9dgw6\" (UID: \"61a4ee31-59ca-4626-b1de-4d70fb7d8789\") " pod="openshift-machine-config-operator/machine-config-server-9dgw6" Oct 07 13:09:31 crc kubenswrapper[4677]: W1007 13:09:31.500729 4677 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod67be2c46_f396_48d1_ba5e_d21f8362a4dc.slice/crio-9a066db80c45422653194ef0ed8c078461ba7f0cab6b6ff947fb2e54c9bc5874 WatchSource:0}: Error finding container 9a066db80c45422653194ef0ed8c078461ba7f0cab6b6ff947fb2e54c9bc5874: Status 404 returned error can't find the container with id 9a066db80c45422653194ef0ed8c078461ba7f0cab6b6ff947fb2e54c9bc5874 Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.504978 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ps9jl\" (UniqueName: \"kubernetes.io/projected/669c03f6-55a4-4fee-b442-99cf9863678e-kube-api-access-ps9jl\") pod \"packageserver-d55dfcdfc-ctrht\" (UID: \"669c03f6-55a4-4fee-b442-99cf9863678e\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-ctrht" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.508204 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jljbf\" (UniqueName: \"kubernetes.io/projected/b84820c2-fd0b-4e52-801c-a70286d639de-kube-api-access-jljbf\") pod \"collect-profiles-29330700-zvg2m\" (UID: \"b84820c2-fd0b-4e52-801c-a70286d639de\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330700-zvg2m" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.508652 4677 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-nhjp9" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.522791 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c9834654-a39c-4991-bd6b-db7c9e37b2d3-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-gmgcc\" (UID: \"c9834654-a39c-4991-bd6b-db7c9e37b2d3\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gmgcc" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.562168 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qlrmg\" (UniqueName: \"kubernetes.io/projected/4b5e8fd2-4cb0-4bfd-8ce1-17f58352a3d8-kube-api-access-qlrmg\") pod \"multus-admission-controller-857f4d67dd-cn5r8\" (UID: \"4b5e8fd2-4cb0-4bfd-8ce1-17f58352a3d8\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-cn5r8" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.565263 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sp28n\" (UniqueName: \"kubernetes.io/projected/7967c1f0-b5b5-4640-b3f0-38588f8af13c-kube-api-access-sp28n\") pod \"kube-storage-version-migrator-operator-b67b599dd-s576r\" (UID: \"7967c1f0-b5b5-4640-b3f0-38588f8af13c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-s576r" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.582239 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w2fjn\" (UniqueName: \"kubernetes.io/projected/74a859c4-374b-4fc4-89c6-e2ae649fe43f-kube-api-access-w2fjn\") pod \"ingress-canary-sk5ff\" (UID: \"74a859c4-374b-4fc4-89c6-e2ae649fe43f\") " pod="openshift-ingress-canary/ingress-canary-sk5ff" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.585121 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bldn4\" (UID: \"51dd4275-14c4-459b-a065-46ae2b4fd741\") " pod="openshift-image-registry/image-registry-697d97f7c8-bldn4" Oct 07 13:09:31 crc kubenswrapper[4677]: E1007 13:09:31.585622 4677 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 13:09:32.085605577 +0000 UTC m=+143.571314692 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bldn4" (UID: "51dd4275-14c4-459b-a065-46ae2b4fd741") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.604337 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rtxh9\" (UniqueName: \"kubernetes.io/projected/56785c5a-ebb4-4921-ae74-5239c1e09cf5-kube-api-access-rtxh9\") pod \"migrator-59844c95c7-qgcjl\" (UID: \"56785c5a-ebb4-4921-ae74-5239c1e09cf5\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qgcjl" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.617545 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-9mpgg" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.623475 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-hn2p2"] Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.626463 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gmgcc" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.631043 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zm7k5\" (UniqueName: \"kubernetes.io/projected/3a01f67f-7cce-4bdc-8c89-10c4ac505f20-kube-api-access-zm7k5\") pod \"package-server-manager-789f6589d5-fd92z\" (UID: \"3a01f67f-7cce-4bdc-8c89-10c4ac505f20\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-fd92z" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.634855 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-b29lb"] Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.640392 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-c4qbb" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.652764 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-45ppz\" (UniqueName: \"kubernetes.io/projected/f3e33968-c99b-483d-8a53-e4d92fba4e12-kube-api-access-45ppz\") pod \"catalog-operator-68c6474976-dqz8j\" (UID: \"f3e33968-c99b-483d-8a53-e4d92fba4e12\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dqz8j" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.653130 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qgcjl" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.660114 4677 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-s576r" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.664226 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bnqg9\" (UniqueName: \"kubernetes.io/projected/6773777c-949c-46da-95b7-c6008e52b396-kube-api-access-bnqg9\") pod \"router-default-5444994796-7pwvn\" (UID: \"6773777c-949c-46da-95b7-c6008e52b396\") " pod="openshift-ingress/router-default-5444994796-7pwvn" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.679359 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7224h\" (UniqueName: \"kubernetes.io/projected/1ed24530-102d-45f3-9d9e-e74a7fefdd7e-kube-api-access-7224h\") pod \"machine-config-controller-84d6567774-b9mxw\" (UID: \"1ed24530-102d-45f3-9d9e-e74a7fefdd7e\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-b9mxw" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.681640 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-7pwvn" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.688854 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 13:09:31 crc kubenswrapper[4677]: E1007 13:09:31.689209 4677 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 13:09:32.189194728 +0000 UTC m=+143.674903843 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.689843 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-ctrht" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.700453 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9f6z9\" (UniqueName: \"kubernetes.io/projected/d891b0d9-0d88-4bd7-9f01-0f5a4991dc92-kube-api-access-9f6z9\") pod \"machine-config-operator-74547568cd-gsr6h\" (UID: \"d891b0d9-0d88-4bd7-9f01-0f5a4991dc92\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gsr6h" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.712544 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lprql" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.713333 4677 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-cn5r8" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.717233 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-r6nq5" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.733114 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-xcssh"] Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.733375 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-fd92z" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.737213 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c9dhb\" (UniqueName: \"kubernetes.io/projected/1548b62c-1671-430a-9286-a999460ae8d3-kube-api-access-c9dhb\") pod \"csi-hostpathplugin-wpdhg\" (UID: \"1548b62c-1671-430a-9286-a999460ae8d3\") " pod="hostpath-provisioner/csi-hostpathplugin-wpdhg" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.742042 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330700-zvg2m" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.761750 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dqz8j" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.782595 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-sk5ff" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.783190 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f5b08d09-cb5f-49e7-b4e3-d52b2e071ee5-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-jh976\" (UID: \"f5b08d09-cb5f-49e7-b4e3-d52b2e071ee5\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jh976" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.784816 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-9dgw6" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.785084 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-rkv4v"] Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.789378 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-rsbwp"] Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.801846 4677 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-wpdhg" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.803100 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bldn4\" (UID: \"51dd4275-14c4-459b-a065-46ae2b4fd741\") " pod="openshift-image-registry/image-registry-697d97f7c8-bldn4" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.808456 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-48ptn\" (UniqueName: \"kubernetes.io/projected/2fef6870-fac6-49bc-8471-fb78198ba057-kube-api-access-48ptn\") pod \"control-plane-machine-set-operator-78cbb6b69f-wklrf\" (UID: \"2fef6870-fac6-49bc-8471-fb78198ba057\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-wklrf" Oct 07 13:09:31 crc kubenswrapper[4677]: E1007 13:09:31.809724 4677 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 13:09:32.309697127 +0000 UTC m=+143.795406242 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bldn4" (UID: "51dd4275-14c4-459b-a065-46ae2b4fd741") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.842213 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-v42j9"] Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.904007 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 13:09:31 crc kubenswrapper[4677]: E1007 13:09:31.904330 4677 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 13:09:32.404315928 +0000 UTC m=+143.890025043 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.945883 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-bt865"] Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.946076 4677 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jh976" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.984585 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-b9mxw" Oct 07 13:09:31 crc kubenswrapper[4677]: I1007 13:09:31.984866 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gsr6h" Oct 07 13:09:32 crc kubenswrapper[4677]: I1007 13:09:32.010135 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bldn4\" (UID: \"51dd4275-14c4-459b-a065-46ae2b4fd741\") " pod="openshift-image-registry/image-registry-697d97f7c8-bldn4" Oct 07 13:09:32 crc kubenswrapper[4677]: E1007 13:09:32.010562 4677 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 13:09:32.510546986 +0000 UTC m=+143.996256101 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bldn4" (UID: "51dd4275-14c4-459b-a065-46ae2b4fd741") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:09:32 crc kubenswrapper[4677]: I1007 13:09:32.010979 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-wklrf" Oct 07 13:09:32 crc kubenswrapper[4677]: I1007 13:09:32.111014 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 13:09:32 crc kubenswrapper[4677]: E1007 13:09:32.111879 4677 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 13:09:32.611859382 +0000 UTC m=+144.097568497 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:09:32 crc kubenswrapper[4677]: I1007 13:09:32.193385 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-s576r"] Oct 07 13:09:32 crc kubenswrapper[4677]: I1007 13:09:32.198838 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gmgcc"] Oct 07 13:09:32 crc kubenswrapper[4677]: I1007 13:09:32.213053 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bldn4\" (UID: \"51dd4275-14c4-459b-a065-46ae2b4fd741\") " pod="openshift-image-registry/image-registry-697d97f7c8-bldn4" Oct 07 13:09:32 crc kubenswrapper[4677]: E1007 13:09:32.213392 4677 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 13:09:32.713380727 +0000 UTC m=+144.199089842 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bldn4" (UID: "51dd4275-14c4-459b-a065-46ae2b4fd741") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:09:32 crc kubenswrapper[4677]: I1007 13:09:32.263968 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-nhjp9"] Oct 07 13:09:32 crc kubenswrapper[4677]: I1007 13:09:32.271517 4677 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bv6vv" podStartSLOduration=124.271496689 podStartE2EDuration="2m4.271496689s" podCreationTimestamp="2025-10-07 13:07:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:09:32.270326096 +0000 UTC m=+143.756035211" watchObservedRunningTime="2025-10-07 13:09:32.271496689 +0000 UTC m=+143.757205804" Oct 07 13:09:32 crc kubenswrapper[4677]: I1007 13:09:32.290949 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-rkv4v" event={"ID":"57faa01e-5137-4dde-9102-e80bb5891cc5","Type":"ContainerStarted","Data":"a92e0f247270593de72a199b2b7dd4d8446398d2d4f6e177750efef53485db6b"} Oct 07 13:09:32 crc kubenswrapper[4677]: I1007 13:09:32.293410 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-mwgkm" 
event={"ID":"67be2c46-f396-48d1-ba5e-d21f8362a4dc","Type":"ContainerStarted","Data":"9a066db80c45422653194ef0ed8c078461ba7f0cab6b6ff947fb2e54c9bc5874"} Oct 07 13:09:32 crc kubenswrapper[4677]: I1007 13:09:32.305244 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-bkhfv" event={"ID":"a5919bd4-5d4e-4aa1-9b66-36460e7e24f3","Type":"ContainerStarted","Data":"cec2eaf8f9a79eb9dfaf287b3ffeeb0ef9088cbcf3c0f1f84493a64cbd18f395"} Oct 07 13:09:32 crc kubenswrapper[4677]: I1007 13:09:32.306051 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330700-zvg2m"] Oct 07 13:09:32 crc kubenswrapper[4677]: I1007 13:09:32.314986 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 13:09:32 crc kubenswrapper[4677]: E1007 13:09:32.315406 4677 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 13:09:32.815389699 +0000 UTC m=+144.301098814 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:09:32 crc kubenswrapper[4677]: I1007 13:09:32.339688 4677 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-gspst" podStartSLOduration=124.33966958 podStartE2EDuration="2m4.33966958s" podCreationTimestamp="2025-10-07 13:07:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:09:32.314418003 +0000 UTC m=+143.800127128" watchObservedRunningTime="2025-10-07 13:09:32.33966958 +0000 UTC m=+143.825378695" Oct 07 13:09:32 crc kubenswrapper[4677]: I1007 13:09:32.391094 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5x69q" event={"ID":"69136696-f636-4c23-b89a-bfbb2eba3a85","Type":"ContainerStarted","Data":"99cc7644b98560adeb9a21972c8d11688e5cb56cc6faccfbedb1b6ed7cf358b0"} Oct 07 13:09:32 crc kubenswrapper[4677]: I1007 13:09:32.413448 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-hn2p2" event={"ID":"579c8074-5c53-4e1f-a620-f04cbccf63aa","Type":"ContainerStarted","Data":"24f6d8569e24cb9bc61d61399fc99ba8726d8fde4ed02c50096abbac36b80cd1"} Oct 07 13:09:32 crc kubenswrapper[4677]: I1007 13:09:32.422341 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bldn4\" (UID: \"51dd4275-14c4-459b-a065-46ae2b4fd741\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-bldn4" Oct 07 13:09:32 crc kubenswrapper[4677]: E1007 13:09:32.422712 4677 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 13:09:32.922699516 +0000 UTC m=+144.408408631 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bldn4" (UID: "51dd4275-14c4-459b-a065-46ae2b4fd741") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:09:32 crc kubenswrapper[4677]: I1007 13:09:32.422951 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-rsbwp" event={"ID":"214bd9b5-75be-4fc3-b973-1ca9d77431bf","Type":"ContainerStarted","Data":"5cce1fffd325c391c265d747494dd427a6f3d0d29d840fdd9b9eb83ba7ad1570"} Oct 07 13:09:32 crc kubenswrapper[4677]: I1007 13:09:32.425150 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-b29lb" event={"ID":"c3c43bec-10f0-4f24-b9f4-6d0cfb694cb5","Type":"ContainerStarted","Data":"8006ac9a24115c8c88b33370ad6a9cf703f1c45a242139dc1f0c28d1abb4531d"} Oct 07 13:09:32 crc kubenswrapper[4677]: I1007 13:09:32.427122 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-v42j9" event={"ID":"5a685905-0b88-4b62-977b-a84e13aa85f7","Type":"ContainerStarted","Data":"a0a8632cef7e9749474bf94edfcd3c709f928daaeb4cc0d3c1b160439967588b"} Oct 07 13:09:32 crc kubenswrapper[4677]: I1007 13:09:32.435565 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-bt865" event={"ID":"f0a169c9-3504-4d10-a04f-0c9223b5acca","Type":"ContainerStarted","Data":"ff8bff639ac67dc16da4a5a1bf19e7be01f00ca85738853e58434760fa913b84"} Oct 07 13:09:32 crc kubenswrapper[4677]: I1007 13:09:32.457803 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-xcssh" event={"ID":"07dd67bc-303c-4a2a-bc68-e430cc5e63c2","Type":"ContainerStarted","Data":"c39917a531a5621c6374bfe9453c51e1ace2f771bc35aa26de2a23aa371cf5db"} Oct 07 13:09:32 crc kubenswrapper[4677]: I1007 13:09:32.487237 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-psk9k" event={"ID":"6edf078f-96f5-46e0-be5c-012e4799f320","Type":"ContainerStarted","Data":"49d98987b0f8eebf2b0586d9a588875999b41be2a9f0994c1d9ae73dc4589899"} Oct 07 13:09:32 crc kubenswrapper[4677]: I1007 13:09:32.494057 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wt49f" event={"ID":"7fc2830b-0f4d-4b3b-89bc-6e589839077d","Type":"ContainerStarted","Data":"f5d06e093eccd12d760187f8d74f388b2f0696930a9423c6f3101ee4055667f5"} Oct 07 13:09:32 crc kubenswrapper[4677]: I1007 13:09:32.494099 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wt49f" 
event={"ID":"7fc2830b-0f4d-4b3b-89bc-6e589839077d","Type":"ContainerStarted","Data":"ac4dff86cb5d33059b21e43bfa77aef61b6f4cdf260c0987f883fed39d07bb83"} Oct 07 13:09:32 crc kubenswrapper[4677]: I1007 13:09:32.516681 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-76928" event={"ID":"364ed7ee-3c5a-4d7f-ba97-ddd52483de83","Type":"ContainerStarted","Data":"fcde7a5b4892c87b4ed3b41e7802f86fdeaa3fa0a460a71ba9cf8d2b43a25359"} Oct 07 13:09:32 crc kubenswrapper[4677]: I1007 13:09:32.516726 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-76928" event={"ID":"364ed7ee-3c5a-4d7f-ba97-ddd52483de83","Type":"ContainerStarted","Data":"702551c4e7a382aa334f3af67b303054c101621b135c9e174732031b2ed4da1e"} Oct 07 13:09:32 crc kubenswrapper[4677]: I1007 13:09:32.523245 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 13:09:32 crc kubenswrapper[4677]: I1007 13:09:32.523773 4677 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-qk4fp" Oct 07 13:09:32 crc kubenswrapper[4677]: E1007 13:09:32.523848 4677 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 13:09:33.023835006 +0000 UTC m=+144.509544121 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:09:32 crc kubenswrapper[4677]: I1007 13:09:32.524018 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bldn4\" (UID: \"51dd4275-14c4-459b-a065-46ae2b4fd741\") " pod="openshift-image-registry/image-registry-697d97f7c8-bldn4" Oct 07 13:09:32 crc kubenswrapper[4677]: E1007 13:09:32.524239 4677 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 13:09:33.02423322 +0000 UTC m=+144.509942335 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bldn4" (UID: "51dd4275-14c4-459b-a065-46ae2b4fd741") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:09:32 crc kubenswrapper[4677]: I1007 13:09:32.524832 4677 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-gspst" Oct 07 13:09:32 crc kubenswrapper[4677]: I1007 13:09:32.550635 4677 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-n2kcb" podStartSLOduration=124.550603658 podStartE2EDuration="2m4.550603658s" podCreationTimestamp="2025-10-07 13:07:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:09:32.537871391 +0000 UTC m=+144.023580516" watchObservedRunningTime="2025-10-07 13:09:32.550603658 +0000 UTC m=+144.036312773" Oct 07 13:09:32 crc kubenswrapper[4677]: I1007 13:09:32.628088 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 13:09:32 crc kubenswrapper[4677]: E1007 13:09:32.628948 4677 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 13:09:33.128928381 +0000 UTC m=+144.614637496 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:09:32 crc kubenswrapper[4677]: I1007 13:09:32.729723 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bldn4\" (UID: \"51dd4275-14c4-459b-a065-46ae2b4fd741\") " pod="openshift-image-registry/image-registry-697d97f7c8-bldn4" Oct 07 13:09:32 crc kubenswrapper[4677]: E1007 13:09:32.730232 4677 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 13:09:33.230220167 +0000 UTC m=+144.715929282 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bldn4" (UID: "51dd4275-14c4-459b-a065-46ae2b4fd741") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:09:32 crc kubenswrapper[4677]: I1007 13:09:32.841570 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 13:09:32 crc kubenswrapper[4677]: E1007 13:09:32.842074 4677 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 13:09:33.34206017 +0000 UTC m=+144.827769285 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:09:32 crc kubenswrapper[4677]: I1007 13:09:32.945652 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bldn4\" (UID: \"51dd4275-14c4-459b-a065-46ae2b4fd741\") " pod="openshift-image-registry/image-registry-697d97f7c8-bldn4" Oct 07 13:09:32 crc kubenswrapper[4677]: E1007 13:09:32.946102 4677 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 13:09:33.446087846 +0000 UTC m=+144.931796961 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bldn4" (UID: "51dd4275-14c4-459b-a065-46ae2b4fd741") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:09:32 crc kubenswrapper[4677]: I1007 13:09:32.961739 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-qgcjl"] Oct 07 13:09:33 crc kubenswrapper[4677]: I1007 13:09:33.049317 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 13:09:33 crc kubenswrapper[4677]: E1007 13:09:33.049935 4677 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 13:09:33.549899575 +0000 UTC m=+145.035608690 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:09:33 crc kubenswrapper[4677]: I1007 13:09:33.050010 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bldn4\" (UID: \"51dd4275-14c4-459b-a065-46ae2b4fd741\") " pod="openshift-image-registry/image-registry-697d97f7c8-bldn4" Oct 07 13:09:33 crc kubenswrapper[4677]: E1007 13:09:33.050536 4677 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 13:09:33.550523288 +0000 UTC m=+145.036232403 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bldn4" (UID: "51dd4275-14c4-459b-a065-46ae2b4fd741") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:09:33 crc kubenswrapper[4677]: I1007 13:09:33.050895 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dqz8j"] Oct 07 13:09:33 crc kubenswrapper[4677]: I1007 13:09:33.070694 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-9mpgg"] Oct 07 13:09:33 crc kubenswrapper[4677]: I1007 13:09:33.081619 4677 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-zvfd4" podStartSLOduration=125.081592637 podStartE2EDuration="2m5.081592637s" podCreationTimestamp="2025-10-07 13:07:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:09:33.060642559 +0000 UTC m=+144.546351674" watchObservedRunningTime="2025-10-07 13:09:33.081592637 +0000 UTC m=+144.567301752" Oct 07 13:09:33 crc kubenswrapper[4677]: I1007 13:09:33.107753 4677 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-qk4fp" podStartSLOduration=125.107737957 podStartE2EDuration="2m5.107737957s" podCreationTimestamp="2025-10-07 13:07:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:09:33.106613865 +0000 UTC m=+144.592322980" watchObservedRunningTime="2025-10-07 13:09:33.107737957 +0000 UTC m=+144.593447072" Oct 07 13:09:33 crc kubenswrapper[4677]: I1007 13:09:33.155049 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 13:09:33 crc kubenswrapper[4677]: E1007 13:09:33.155395 4677 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 13:09:33.655347863 +0000 UTC m=+145.141056988 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:09:33 crc kubenswrapper[4677]: I1007 13:09:33.155593 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bldn4\" (UID: \"51dd4275-14c4-459b-a065-46ae2b4fd741\") " pod="openshift-image-registry/image-registry-697d97f7c8-bldn4" Oct 07 13:09:33 crc kubenswrapper[4677]: E1007 13:09:33.155893 4677 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 13:09:33.655885523 +0000 UTC m=+145.141594638 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bldn4" (UID: "51dd4275-14c4-459b-a065-46ae2b4fd741") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:09:33 crc kubenswrapper[4677]: I1007 13:09:33.206939 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-ctrht"] Oct 07 13:09:33 crc kubenswrapper[4677]: I1007 13:09:33.208800 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-c4qbb"] Oct 07 13:09:33 crc kubenswrapper[4677]: I1007 13:09:33.214853 4677 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rfpfx" podStartSLOduration=124.214835385 podStartE2EDuration="2m4.214835385s" podCreationTimestamp="2025-10-07 13:07:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:09:33.204534528 +0000 UTC m=+144.690243663" watchObservedRunningTime="2025-10-07 13:09:33.214835385 +0000 UTC m=+144.700544500" Oct 07 13:09:33 crc kubenswrapper[4677]: I1007 13:09:33.266995 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 13:09:33 crc kubenswrapper[4677]: E1007 13:09:33.267331 4677 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 13:09:33.767313441 +0000 UTC m=+145.253022556 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:09:33 crc kubenswrapper[4677]: I1007 13:09:33.368267 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bldn4\" (UID: \"51dd4275-14c4-459b-a065-46ae2b4fd741\") " pod="openshift-image-registry/image-registry-697d97f7c8-bldn4" Oct 07 13:09:33 crc kubenswrapper[4677]: E1007 13:09:33.370517 4677 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 13:09:33.870504326 +0000 UTC m=+145.356213441 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bldn4" (UID: "51dd4275-14c4-459b-a065-46ae2b4fd741") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:09:33 crc kubenswrapper[4677]: I1007 13:09:33.373357 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lprql"] Oct 07 13:09:33 crc kubenswrapper[4677]: I1007 13:09:33.380229 4677 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-b5qm4" podStartSLOduration=124.380213962 podStartE2EDuration="2m4.380213962s" podCreationTimestamp="2025-10-07 13:07:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:09:33.377618477 +0000 UTC m=+144.863327592" watchObservedRunningTime="2025-10-07 13:09:33.380213962 +0000 UTC m=+144.865923077" Oct 07 13:09:33 crc kubenswrapper[4677]: I1007 13:09:33.404345 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-fd92z"] Oct 07 13:09:33 crc kubenswrapper[4677]: I1007 13:09:33.443867 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-r6nq5"] Oct 07 13:09:33 crc kubenswrapper[4677]: I1007 13:09:33.471475 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 13:09:33 crc kubenswrapper[4677]: E1007 13:09:33.471578 4677 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b 
nodeName:}" failed. No retries permitted until 2025-10-07 13:09:33.971554383 +0000 UTC m=+145.457263498 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:09:33 crc kubenswrapper[4677]: I1007 13:09:33.471815 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bldn4\" (UID: \"51dd4275-14c4-459b-a065-46ae2b4fd741\") " pod="openshift-image-registry/image-registry-697d97f7c8-bldn4" Oct 07 13:09:33 crc kubenswrapper[4677]: E1007 13:09:33.472125 4677 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 13:09:33.972113054 +0000 UTC m=+145.457822169 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bldn4" (UID: "51dd4275-14c4-459b-a065-46ae2b4fd741") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:09:33 crc kubenswrapper[4677]: I1007 13:09:33.630601 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-cn5r8"] Oct 07 13:09:33 crc kubenswrapper[4677]: I1007 13:09:33.631604 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 13:09:33 crc kubenswrapper[4677]: E1007 13:09:33.632376 4677 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 13:09:34.132359232 +0000 UTC m=+145.618068347 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:09:33 crc kubenswrapper[4677]: I1007 13:09:33.679975 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29330700-zvg2m" event={"ID":"b84820c2-fd0b-4e52-801c-a70286d639de","Type":"ContainerStarted","Data":"f538f6f357103da92c37795caa9efcf1541ba6dad60f92ce602a4c73ae66687d"} Oct 07 13:09:33 crc kubenswrapper[4677]: I1007 13:09:33.680028 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29330700-zvg2m" event={"ID":"b84820c2-fd0b-4e52-801c-a70286d639de","Type":"ContainerStarted","Data":"a6325291ea6080c047f7f18967697b93b638f421760a479dd28ca2db501665c3"} Oct 07 13:09:33 crc kubenswrapper[4677]: I1007 13:09:33.680988 4677 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wt49f" podStartSLOduration=125.680967926 podStartE2EDuration="2m5.680967926s" podCreationTimestamp="2025-10-07 13:07:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:09:33.67999823 +0000 UTC m=+145.165707355" watchObservedRunningTime="2025-10-07 13:09:33.680967926 +0000 UTC m=+145.166677051" Oct 07 13:09:33 crc kubenswrapper[4677]: I1007 13:09:33.688202 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-9mpgg" event={"ID":"f2c52f9a-2f5a-4bfe-9293-40e9b151fcd4","Type":"ContainerStarted","Data":"9b795e52f46dc6ca7408189e0500d7d41080e9d37f027437edf172aeed926c53"} Oct 07 13:09:33 crc kubenswrapper[4677]: I1007 13:09:33.736632 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jh976"] Oct 07 13:09:33 crc kubenswrapper[4677]: I1007 13:09:33.737904 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bldn4\" (UID: \"51dd4275-14c4-459b-a065-46ae2b4fd741\") " pod="openshift-image-registry/image-registry-697d97f7c8-bldn4" Oct 07 13:09:33 crc kubenswrapper[4677]: I1007 13:09:33.738008 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-psk9k" event={"ID":"6edf078f-96f5-46e0-be5c-012e4799f320","Type":"ContainerStarted","Data":"362610cde7cac3373eb8a41642c414ac42217276a85528f5b4ff899d22c6fc3d"} Oct 07 13:09:33 crc kubenswrapper[4677]: E1007 13:09:33.738575 4677 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 13:09:34.238563899 +0000 UTC m=+145.724273014 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bldn4" (UID: "51dd4275-14c4-459b-a065-46ae2b4fd741") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:09:33 crc kubenswrapper[4677]: I1007 13:09:33.757744 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gmgcc" event={"ID":"c9834654-a39c-4991-bd6b-db7c9e37b2d3","Type":"ContainerStarted","Data":"3400e886b8e7acc6c5792ca368107d10d2c50032f6301b27d23fe641ed0eb3d5"} Oct 07 13:09:33 crc kubenswrapper[4677]: I1007 13:09:33.764179 4677 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-76928" podStartSLOduration=125.764163198 podStartE2EDuration="2m5.764163198s" podCreationTimestamp="2025-10-07 13:07:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:09:33.756467246 +0000 UTC m=+145.242176361" watchObservedRunningTime="2025-10-07 13:09:33.764163198 +0000 UTC m=+145.249872323" Oct 07 13:09:33 crc kubenswrapper[4677]: I1007 13:09:33.764474 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-sk5ff"] Oct 07 13:09:33 crc kubenswrapper[4677]: I1007 13:09:33.809166 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-wpdhg"] Oct 07 13:09:33 crc kubenswrapper[4677]: I1007 13:09:33.813508 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-nhjp9" event={"ID":"12fff5e2-3fc4-4418-b0b8-04929a968823","Type":"ContainerStarted","Data":"7a985f223457005cf5342c8e5cb9c6733741b1d4aa5602a4fd0514db3a374bf3"} Oct 07 13:09:33 crc kubenswrapper[4677]: I1007 13:09:33.813550 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-nhjp9" event={"ID":"12fff5e2-3fc4-4418-b0b8-04929a968823","Type":"ContainerStarted","Data":"d7d55ab0dffab5302e868a2e1e4076a7771e8d623d82c0e2da09cc821019d0f0"} Oct 07 13:09:33 crc kubenswrapper[4677]: I1007 13:09:33.839598 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 13:09:33 crc kubenswrapper[4677]: E1007 13:09:33.839706 4677 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 13:09:34.339690629 +0000 UTC m=+145.825399744 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:09:33 crc kubenswrapper[4677]: I1007 13:09:33.840819 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bldn4\" (UID: \"51dd4275-14c4-459b-a065-46ae2b4fd741\") " pod="openshift-image-registry/image-registry-697d97f7c8-bldn4" Oct 07 13:09:33 crc kubenswrapper[4677]: E1007 13:09:33.842241 4677 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 13:09:34.342226332 +0000 UTC m=+145.827935447 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bldn4" (UID: "51dd4275-14c4-459b-a065-46ae2b4fd741") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:09:33 crc kubenswrapper[4677]: I1007 13:09:33.870574 4677 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5x69q" podStartSLOduration=124.870560081 podStartE2EDuration="2m4.870560081s" podCreationTimestamp="2025-10-07 13:07:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:09:33.831897833 +0000 UTC m=+145.317606948" watchObservedRunningTime="2025-10-07 13:09:33.870560081 +0000 UTC m=+145.356269196" Oct 07 13:09:33 crc kubenswrapper[4677]: I1007 13:09:33.870712 4677 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-psk9k" podStartSLOduration=125.870707296 podStartE2EDuration="2m5.870707296s" podCreationTimestamp="2025-10-07 13:07:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:09:33.862661551 +0000 UTC m=+145.348370656" watchObservedRunningTime="2025-10-07 13:09:33.870707296 +0000 UTC m=+145.356416411" Oct 07 13:09:33 crc kubenswrapper[4677]: I1007 13:09:33.881482 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-v42j9" event={"ID":"5a685905-0b88-4b62-977b-a84e13aa85f7","Type":"ContainerStarted","Data":"f39dcefd95664d4c4f94ab5ed2695f9084e03a420535ae028108bf39827f5fbf"} Oct 07 13:09:33 crc kubenswrapper[4677]: I1007 13:09:33.887223 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qgcjl" 
event={"ID":"56785c5a-ebb4-4921-ae74-5239c1e09cf5","Type":"ContainerStarted","Data":"1412e8c7c58f88f4ca1dbb37ffe3487810069291157aee960203d5f9aef0f816"} Oct 07 13:09:33 crc kubenswrapper[4677]: I1007 13:09:33.905573 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-b29lb" event={"ID":"c3c43bec-10f0-4f24-b9f4-6d0cfb694cb5","Type":"ContainerStarted","Data":"3fa777d3a2f1b0bf336b16fb3d1cafb8847b43f8ac72749f88bdbd8ec9ccea4f"} Oct 07 13:09:33 crc kubenswrapper[4677]: I1007 13:09:33.920303 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-wklrf"] Oct 07 13:09:33 crc kubenswrapper[4677]: I1007 13:09:33.923361 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-s576r" event={"ID":"7967c1f0-b5b5-4640-b3f0-38588f8af13c","Type":"ContainerStarted","Data":"7c4a62955fef37bfe8caafc33fa7ca8b9de887cb16e345d09cadaa673eea0cec"} Oct 07 13:09:33 crc kubenswrapper[4677]: I1007 13:09:33.923393 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-s576r" event={"ID":"7967c1f0-b5b5-4640-b3f0-38588f8af13c","Type":"ContainerStarted","Data":"8f2f802fa00eebb5e612974de50ec0a2909d660f64431312d70434a6331e3a2b"} Oct 07 13:09:33 crc kubenswrapper[4677]: W1007 13:09:33.924743 4677 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1548b62c_1671_430a_9286_a999460ae8d3.slice/crio-b2ca7e478148c8c34f8179c580fcc4c4b7077686381003b9564465fe80ec3b49 WatchSource:0}: Error finding container b2ca7e478148c8c34f8179c580fcc4c4b7077686381003b9564465fe80ec3b49: Status 404 returned error can't find the container with id b2ca7e478148c8c34f8179c580fcc4c4b7077686381003b9564465fe80ec3b49 Oct 07 13:09:33 crc kubenswrapper[4677]: I1007 13:09:33.933418 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-b9mxw"] Oct 07 13:09:33 crc kubenswrapper[4677]: I1007 13:09:33.937056 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-ctrht" event={"ID":"669c03f6-55a4-4fee-b442-99cf9863678e","Type":"ContainerStarted","Data":"ed48339e3f8457ab955f20876ea7afd39d149eaa039afaf5d0a3cba5c869780f"} Oct 07 13:09:33 crc kubenswrapper[4677]: I1007 13:09:33.942411 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 13:09:33 crc kubenswrapper[4677]: E1007 13:09:33.943352 4677 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 13:09:34.443331321 +0000 UTC m=+145.929040436 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:09:33 crc kubenswrapper[4677]: I1007 13:09:33.945185 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-xcssh" event={"ID":"07dd67bc-303c-4a2a-bc68-e430cc5e63c2","Type":"ContainerStarted","Data":"f371b38bac6c887812d6ad74a7c9b978b8eeb7b2c4977ca14d32fc5df51a68cd"} Oct 07 13:09:33 crc kubenswrapper[4677]: I1007 13:09:33.947366 4677 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29330700-zvg2m" podStartSLOduration=124.947355548 podStartE2EDuration="2m4.947355548s" podCreationTimestamp="2025-10-07 13:07:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:09:33.946837389 +0000 UTC m=+145.432546514" watchObservedRunningTime="2025-10-07 13:09:33.947355548 +0000 UTC m=+145.433064663" Oct 07 13:09:33 crc kubenswrapper[4677]: I1007 13:09:33.952069 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-hn2p2" event={"ID":"579c8074-5c53-4e1f-a620-f04cbccf63aa","Type":"ContainerStarted","Data":"1b9caf4aa387931c5610ee8d87df5bf47d1326041a3ea46ef7de48961d7199e1"} Oct 07 13:09:33 crc kubenswrapper[4677]: I1007 13:09:33.953127 4677 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-hn2p2" Oct 07 13:09:33 crc kubenswrapper[4677]: I1007 13:09:33.964340 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-c4qbb" event={"ID":"72fcb258-1307-45d4-bbff-c55e81ab1df3","Type":"ContainerStarted","Data":"4754c182cd4e3a4e7f8529f74788025e6824dfe0bc0145d793260ea169a43466"} Oct 07 13:09:33 crc kubenswrapper[4677]: I1007 13:09:33.976851 4677 patch_prober.go:28] interesting pod/console-operator-58897d9998-hn2p2 container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.24:8443/readyz\": dial tcp 10.217.0.24:8443: connect: connection refused" start-of-body= Oct 07 13:09:33 crc kubenswrapper[4677]: I1007 13:09:33.976895 4677 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-hn2p2" podUID="579c8074-5c53-4e1f-a620-f04cbccf63aa" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.24:8443/readyz\": dial tcp 10.217.0.24:8443: connect: connection refused" Oct 07 13:09:33 crc kubenswrapper[4677]: I1007 13:09:33.983481 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-fd92z" event={"ID":"3a01f67f-7cce-4bdc-8c89-10c4ac505f20","Type":"ContainerStarted","Data":"69a995be4696f0970ccd9bfb2574b9206109bf546bbb21d9b1f6ed1969d763e1"} Oct 07 13:09:33 crc kubenswrapper[4677]: I1007 13:09:33.986318 4677 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-s576r" podStartSLOduration=124.986299457 podStartE2EDuration="2m4.986299457s" podCreationTimestamp="2025-10-07 13:07:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:09:33.98394026 +0000 UTC m=+145.469649375" watchObservedRunningTime="2025-10-07 13:09:33.986299457 +0000 UTC m=+145.472008572" Oct 07 13:09:33 crc kubenswrapper[4677]: I1007 13:09:33.994535 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-gsr6h"] Oct 07 13:09:34 crc kubenswrapper[4677]: I1007 13:09:34.015920 4677 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-v42j9" podStartSLOduration=125.015897623 podStartE2EDuration="2m5.015897623s" podCreationTimestamp="2025-10-07 13:07:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:09:34.009895373 +0000 UTC m=+145.495604488" watchObservedRunningTime="2025-10-07 13:09:34.015897623 +0000 UTC m=+145.501606738" Oct 07 13:09:34 crc kubenswrapper[4677]: I1007 13:09:34.019373 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-mwgkm" event={"ID":"67be2c46-f396-48d1-ba5e-d21f8362a4dc","Type":"ContainerStarted","Data":"c61e4a07a2c4b9d82f4a9fef82957a619b2d37bc89e6ec87b3e67c7d87c60638"} Oct 07 13:09:34 crc kubenswrapper[4677]: I1007 13:09:34.020418 4677 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-mwgkm" Oct 07 13:09:34 crc kubenswrapper[4677]: I1007 13:09:34.036484 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-bkhfv" event={"ID":"a5919bd4-5d4e-4aa1-9b66-36460e7e24f3","Type":"ContainerStarted","Data":"1708e2f49cac6b95afccf745cfbd4210d2c681cf1d94a487540ba9f248c99da2"} Oct 07 13:09:34 crc kubenswrapper[4677]: I1007 13:09:34.038412 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-7pwvn" event={"ID":"6773777c-949c-46da-95b7-c6008e52b396","Type":"ContainerStarted","Data":"ca7e7f549499fc640ff7c6de461f15c513f48b033cfc46f7b35398dfffab8fe1"} Oct 07 13:09:34 crc kubenswrapper[4677]: I1007 13:09:34.038487 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-7pwvn" event={"ID":"6773777c-949c-46da-95b7-c6008e52b396","Type":"ContainerStarted","Data":"93f3e46761785c6aacb1932e7768b2c40066bfff1e8fb1f786f27db2a7429e43"} Oct 07 13:09:34 crc kubenswrapper[4677]: I1007 13:09:34.039544 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-9dgw6" event={"ID":"61a4ee31-59ca-4626-b1de-4d70fb7d8789","Type":"ContainerStarted","Data":"6e5859ffacc092d02008c9e1cd0de92a4dc488875aeb10ed1c3d4cf84dfebb49"} Oct 07 13:09:34 crc kubenswrapper[4677]: I1007 13:09:34.039583 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-9dgw6" event={"ID":"61a4ee31-59ca-4626-b1de-4d70fb7d8789","Type":"ContainerStarted","Data":"13a3b84f4161746a5f46b2f81da6a50241fed0443eafae811c908f1a7e2d3507"} Oct 07 13:09:34 crc kubenswrapper[4677]: I1007 13:09:34.040602 4677 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dqz8j" event={"ID":"f3e33968-c99b-483d-8a53-e4d92fba4e12","Type":"ContainerStarted","Data":"2f4c631ae08e052cf2e4ec0a38d282cc47b28acc36c01a0125af6ea08c366eef"} Oct 07 13:09:34 crc kubenswrapper[4677]: I1007 13:09:34.041575 4677 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dqz8j" Oct 07 13:09:34 crc kubenswrapper[4677]: I1007 13:09:34.044965 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-rsbwp" event={"ID":"214bd9b5-75be-4fc3-b973-1ca9d77431bf","Type":"ContainerStarted","Data":"df84a5a19098ebf29e85bc2a58908b06cb2e89eb02b68f55a9e4d6411b4e901c"} Oct 07 13:09:34 crc kubenswrapper[4677]: I1007 13:09:34.045307 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bldn4\" (UID: \"51dd4275-14c4-459b-a065-46ae2b4fd741\") " pod="openshift-image-registry/image-registry-697d97f7c8-bldn4" Oct 07 13:09:34 crc kubenswrapper[4677]: I1007 13:09:34.046587 4677 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-b29lb" podStartSLOduration=125.046576368 podStartE2EDuration="2m5.046576368s" podCreationTimestamp="2025-10-07 13:07:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:09:34.044808443 +0000 UTC m=+145.530517558" watchObservedRunningTime="2025-10-07 13:09:34.046576368 +0000 UTC m=+145.532285483" Oct 07 13:09:34 crc kubenswrapper[4677]: E1007 13:09:34.048727 4677 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 13:09:34.548706246 +0000 UTC m=+146.034415361 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bldn4" (UID: "51dd4275-14c4-459b-a065-46ae2b4fd741") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:09:34 crc kubenswrapper[4677]: I1007 13:09:34.058093 4677 patch_prober.go:28] interesting pod/downloads-7954f5f757-mwgkm container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" start-of-body= Oct 07 13:09:34 crc kubenswrapper[4677]: I1007 13:09:34.058148 4677 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-mwgkm" podUID="67be2c46-f396-48d1-ba5e-d21f8362a4dc" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" Oct 07 13:09:34 crc kubenswrapper[4677]: I1007 13:09:34.072292 4677 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-dqz8j container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.27:8443/healthz\": dial tcp 10.217.0.27:8443: connect: connection refused" start-of-body= Oct 07 13:09:34 crc kubenswrapper[4677]: I1007 13:09:34.072344 4677 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dqz8j" podUID="f3e33968-c99b-483d-8a53-e4d92fba4e12" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.27:8443/healthz\": dial tcp 10.217.0.27:8443: connect: connection refused" Oct 07 13:09:34 crc kubenswrapper[4677]: I1007 13:09:34.084669 4677 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-7pwvn" podStartSLOduration=125.084655595 podStartE2EDuration="2m5.084655595s" podCreationTimestamp="2025-10-07 13:07:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:09:34.083550765 +0000 UTC m=+145.569259890" watchObservedRunningTime="2025-10-07 13:09:34.084655595 +0000 UTC m=+145.570364710" Oct 07 13:09:34 crc kubenswrapper[4677]: I1007 13:09:34.129331 4677 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-rsbwp" podStartSLOduration=125.129306633 podStartE2EDuration="2m5.129306633s" podCreationTimestamp="2025-10-07 13:07:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:09:34.120653996 +0000 UTC m=+145.606363111" watchObservedRunningTime="2025-10-07 13:09:34.129306633 +0000 UTC m=+145.615015788" Oct 07 13:09:34 crc kubenswrapper[4677]: W1007 13:09:34.150581 4677 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1ed24530_102d_45f3_9d9e_e74a7fefdd7e.slice/crio-9ec06825a59f7350c401e223faab9a6ae94e0931a7694e43531752e686e19a8d WatchSource:0}: Error finding container 9ec06825a59f7350c401e223faab9a6ae94e0931a7694e43531752e686e19a8d: Status 404 returned error can't find the container 
with id 9ec06825a59f7350c401e223faab9a6ae94e0931a7694e43531752e686e19a8d Oct 07 13:09:34 crc kubenswrapper[4677]: I1007 13:09:34.152836 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 13:09:34 crc kubenswrapper[4677]: E1007 13:09:34.153826 4677 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 13:09:34.653803532 +0000 UTC m=+146.139512647 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:09:34 crc kubenswrapper[4677]: I1007 13:09:34.217862 4677 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-bkhfv" podStartSLOduration=126.217843481 podStartE2EDuration="2m6.217843481s" podCreationTimestamp="2025-10-07 13:07:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:09:34.175249269 +0000 UTC m=+145.660958384" watchObservedRunningTime="2025-10-07 13:09:34.217843481 +0000 UTC m=+145.703552586" Oct 07 13:09:34 crc kubenswrapper[4677]: I1007 13:09:34.218537 4677 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-9dgw6" podStartSLOduration=6.218531257 podStartE2EDuration="6.218531257s" podCreationTimestamp="2025-10-07 13:09:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:09:34.203968622 +0000 UTC m=+145.689677737" watchObservedRunningTime="2025-10-07 13:09:34.218531257 +0000 UTC m=+145.704240372" Oct 07 13:09:34 crc kubenswrapper[4677]: I1007 13:09:34.245250 4677 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-mwgkm" podStartSLOduration=126.245235956 podStartE2EDuration="2m6.245235956s" podCreationTimestamp="2025-10-07 13:07:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:09:34.243348407 +0000 UTC m=+145.729057522" watchObservedRunningTime="2025-10-07 13:09:34.245235956 +0000 UTC m=+145.730945071" Oct 07 13:09:34 crc kubenswrapper[4677]: I1007 13:09:34.255185 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bldn4\" (UID: \"51dd4275-14c4-459b-a065-46ae2b4fd741\") " pod="openshift-image-registry/image-registry-697d97f7c8-bldn4" Oct 07 13:09:34 crc 
kubenswrapper[4677]: E1007 13:09:34.255575 4677 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 13:09:34.755554605 +0000 UTC m=+146.241263740 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bldn4" (UID: "51dd4275-14c4-459b-a065-46ae2b4fd741") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:09:34 crc kubenswrapper[4677]: I1007 13:09:34.276267 4677 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-hn2p2" podStartSLOduration=126.276248654 podStartE2EDuration="2m6.276248654s" podCreationTimestamp="2025-10-07 13:07:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:09:34.27587503 +0000 UTC m=+145.761584145" watchObservedRunningTime="2025-10-07 13:09:34.276248654 +0000 UTC m=+145.761957769" Oct 07 13:09:34 crc kubenswrapper[4677]: I1007 13:09:34.310954 4677 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dqz8j" podStartSLOduration=125.310935706 podStartE2EDuration="2m5.310935706s" podCreationTimestamp="2025-10-07 13:07:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:09:34.310846823 +0000 UTC m=+145.796555928" watchObservedRunningTime="2025-10-07 13:09:34.310935706 +0000 UTC m=+145.796644831" Oct 07 13:09:34 crc kubenswrapper[4677]: I1007 13:09:34.362607 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 13:09:34 crc kubenswrapper[4677]: E1007 13:09:34.363049 4677 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 13:09:34.863031668 +0000 UTC m=+146.348740783 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:09:34 crc kubenswrapper[4677]: I1007 13:09:34.476535 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bldn4\" (UID: \"51dd4275-14c4-459b-a065-46ae2b4fd741\") " pod="openshift-image-registry/image-registry-697d97f7c8-bldn4" Oct 07 13:09:34 crc kubenswrapper[4677]: E1007 13:09:34.476850 4677 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 13:09:34.976839323 +0000 UTC m=+146.462548438 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bldn4" (UID: "51dd4275-14c4-459b-a065-46ae2b4fd741") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:09:34 crc kubenswrapper[4677]: I1007 13:09:34.563586 4677 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-bkhfv" Oct 07 13:09:34 crc kubenswrapper[4677]: I1007 13:09:34.563626 4677 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-bkhfv" Oct 07 13:09:34 crc kubenswrapper[4677]: I1007 13:09:34.578205 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 13:09:34 crc kubenswrapper[4677]: E1007 13:09:34.578506 4677 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 13:09:35.078490352 +0000 UTC m=+146.564199467 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:09:34 crc kubenswrapper[4677]: I1007 13:09:34.680999 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bldn4\" (UID: \"51dd4275-14c4-459b-a065-46ae2b4fd741\") " pod="openshift-image-registry/image-registry-697d97f7c8-bldn4" Oct 07 13:09:34 crc kubenswrapper[4677]: E1007 13:09:34.681520 4677 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 13:09:35.181508251 +0000 UTC m=+146.667217366 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bldn4" (UID: "51dd4275-14c4-459b-a065-46ae2b4fd741") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:09:34 crc kubenswrapper[4677]: I1007 13:09:34.683016 4677 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-7pwvn" Oct 07 13:09:34 crc kubenswrapper[4677]: I1007 13:09:34.691326 4677 patch_prober.go:28] interesting pod/router-default-5444994796-7pwvn container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 07 13:09:34 crc kubenswrapper[4677]: [-]has-synced failed: reason withheld Oct 07 13:09:34 crc kubenswrapper[4677]: [+]process-running ok Oct 07 13:09:34 crc kubenswrapper[4677]: healthz check failed Oct 07 13:09:34 crc kubenswrapper[4677]: I1007 13:09:34.691384 4677 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-7pwvn" podUID="6773777c-949c-46da-95b7-c6008e52b396" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 07 13:09:34 crc kubenswrapper[4677]: I1007 13:09:34.782411 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 13:09:34 crc kubenswrapper[4677]: E1007 13:09:34.782540 4677 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 13:09:35.282515187 +0000 UTC m=+146.768224302 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:09:34 crc kubenswrapper[4677]: I1007 13:09:34.782590 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bldn4\" (UID: \"51dd4275-14c4-459b-a065-46ae2b4fd741\") " pod="openshift-image-registry/image-registry-697d97f7c8-bldn4" Oct 07 13:09:34 crc kubenswrapper[4677]: E1007 13:09:34.782901 4677 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 13:09:35.28289028 +0000 UTC m=+146.768599385 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bldn4" (UID: "51dd4275-14c4-459b-a065-46ae2b4fd741") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:09:34 crc kubenswrapper[4677]: I1007 13:09:34.883608 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 13:09:34 crc kubenswrapper[4677]: E1007 13:09:34.884197 4677 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 13:09:35.384182586 +0000 UTC m=+146.869891701 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:09:34 crc kubenswrapper[4677]: I1007 13:09:34.902215 4677 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5x69q" Oct 07 13:09:34 crc kubenswrapper[4677]: I1007 13:09:34.902270 4677 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5x69q" Oct 07 13:09:34 crc kubenswrapper[4677]: I1007 13:09:34.926366 4677 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5x69q" Oct 07 13:09:34 crc kubenswrapper[4677]: I1007 13:09:34.985209 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bldn4\" (UID: \"51dd4275-14c4-459b-a065-46ae2b4fd741\") " pod="openshift-image-registry/image-registry-697d97f7c8-bldn4" Oct 07 13:09:34 crc kubenswrapper[4677]: E1007 13:09:34.985583 4677 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 13:09:35.485567566 +0000 UTC m=+146.971276681 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bldn4" (UID: "51dd4275-14c4-459b-a065-46ae2b4fd741") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:09:35 crc kubenswrapper[4677]: I1007 13:09:35.073840 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-sk5ff" event={"ID":"74a859c4-374b-4fc4-89c6-e2ae649fe43f","Type":"ContainerStarted","Data":"44ec7adbeda8082ce98356d775c6523204d07c72c8dd616c747c0fcb50e5c4ec"} Oct 07 13:09:35 crc kubenswrapper[4677]: I1007 13:09:35.074114 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-sk5ff" event={"ID":"74a859c4-374b-4fc4-89c6-e2ae649fe43f","Type":"ContainerStarted","Data":"367ddacf3d36f9a621a402ec57d16200c92f030975231b3194544688e182333c"} Oct 07 13:09:35 crc kubenswrapper[4677]: I1007 13:09:35.078729 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-b9mxw" event={"ID":"1ed24530-102d-45f3-9d9e-e74a7fefdd7e","Type":"ContainerStarted","Data":"9eaf04bb27056f8995aa756d351b4d8449657cbe46b64b393233a9ae1094efbe"} Oct 07 13:09:35 crc kubenswrapper[4677]: I1007 13:09:35.078788 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-b9mxw" event={"ID":"1ed24530-102d-45f3-9d9e-e74a7fefdd7e","Type":"ContainerStarted","Data":"9ec06825a59f7350c401e223faab9a6ae94e0931a7694e43531752e686e19a8d"} Oct 07 13:09:35 crc kubenswrapper[4677]: I1007 13:09:35.087152 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lprql" event={"ID":"d8c1b6d9-29d0-4888-b8bf-3380aabe575f","Type":"ContainerStarted","Data":"cf6346a8d9c5012f2682880dd62b1ed7447a83d2b351704376722ab6f2542948"} Oct 07 13:09:35 crc kubenswrapper[4677]: I1007 13:09:35.087211 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lprql" event={"ID":"d8c1b6d9-29d0-4888-b8bf-3380aabe575f","Type":"ContainerStarted","Data":"8ee2151d3310f69d9ff246da1084ec11ef627e36168b66e8fd087b3672b479fe"} Oct 07 13:09:35 crc kubenswrapper[4677]: I1007 13:09:35.087734 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 13:09:35 crc kubenswrapper[4677]: I1007 13:09:35.087796 4677 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lprql" Oct 07 13:09:35 crc kubenswrapper[4677]: E1007 13:09:35.088168 4677 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 13:09:35.588151919 +0000 UTC m=+147.073861034 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:09:35 crc kubenswrapper[4677]: I1007 13:09:35.089349 4677 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-lprql container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.35:8443/healthz\": dial tcp 10.217.0.35:8443: connect: connection refused" start-of-body= Oct 07 13:09:35 crc kubenswrapper[4677]: I1007 13:09:35.089410 4677 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lprql" podUID="d8c1b6d9-29d0-4888-b8bf-3380aabe575f" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.35:8443/healthz\": dial tcp 10.217.0.35:8443: connect: connection refused" Oct 07 13:09:35 crc kubenswrapper[4677]: I1007 13:09:35.103954 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qgcjl" event={"ID":"56785c5a-ebb4-4921-ae74-5239c1e09cf5","Type":"ContainerStarted","Data":"62f6104c64fd7f283bb0453117a1615072e13808a7e20a9039c5e9f2ffa995db"} Oct 07 13:09:35 crc kubenswrapper[4677]: I1007 13:09:35.104012 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qgcjl" event={"ID":"56785c5a-ebb4-4921-ae74-5239c1e09cf5","Type":"ContainerStarted","Data":"4e0df8b4fe447cd018898f709addec4ead50ccc821c1747e2d6b11931255f448"} Oct 07 13:09:35 crc kubenswrapper[4677]: I1007 13:09:35.109723 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-wklrf" event={"ID":"2fef6870-fac6-49bc-8471-fb78198ba057","Type":"ContainerStarted","Data":"2b1ab8d9bb446927e626b3d9f8d393d5d02a19d17bb8cd90baf88481356fc700"} Oct 07 13:09:35 crc kubenswrapper[4677]: I1007 13:09:35.109768 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-wklrf" event={"ID":"2fef6870-fac6-49bc-8471-fb78198ba057","Type":"ContainerStarted","Data":"8870bd31e7cde98889fda446b7cd01d947ac1cf0b31ce4366f2ec0e3341468dc"} Oct 07 13:09:35 crc kubenswrapper[4677]: I1007 13:09:35.141594 4677 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-sk5ff" podStartSLOduration=7.141577449 podStartE2EDuration="7.141577449s" podCreationTimestamp="2025-10-07 13:09:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:09:35.095616953 +0000 UTC m=+146.581326068" watchObservedRunningTime="2025-10-07 13:09:35.141577449 +0000 UTC m=+146.627286564" Oct 07 13:09:35 crc kubenswrapper[4677]: I1007 13:09:35.156523 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-bt865" event={"ID":"f0a169c9-3504-4d10-a04f-0c9223b5acca","Type":"ContainerStarted","Data":"fb436ad563dd256705c60e73ea6ed85bef08554a473a65d05830930ad57aa0c2"} Oct 07 13:09:35 crc kubenswrapper[4677]: I1007 13:09:35.156593 4677 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-bt865" event={"ID":"f0a169c9-3504-4d10-a04f-0c9223b5acca","Type":"ContainerStarted","Data":"abd95c9bcb0ec11c66f816f706ebb1fb2020b1803329164fd63e9235b8ed201f"} Oct 07 13:09:35 crc kubenswrapper[4677]: I1007 13:09:35.156700 4677 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-bt865" Oct 07 13:09:35 crc kubenswrapper[4677]: I1007 13:09:35.174503 4677 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lprql" podStartSLOduration=126.174484616 podStartE2EDuration="2m6.174484616s" podCreationTimestamp="2025-10-07 13:07:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:09:35.142909278 +0000 UTC m=+146.628618393" watchObservedRunningTime="2025-10-07 13:09:35.174484616 +0000 UTC m=+146.660193731" Oct 07 13:09:35 crc kubenswrapper[4677]: I1007 13:09:35.176643 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-cn5r8" event={"ID":"4b5e8fd2-4cb0-4bfd-8ce1-17f58352a3d8","Type":"ContainerStarted","Data":"9863f003da4759febe65b87b1fb996c519bbd9b44fd29c55d9bce73fb6a9b367"} Oct 07 13:09:35 crc kubenswrapper[4677]: I1007 13:09:35.176672 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-cn5r8" event={"ID":"4b5e8fd2-4cb0-4bfd-8ce1-17f58352a3d8","Type":"ContainerStarted","Data":"c84ca36a4111f9d9bc3cd653414a5fb6d193d55df3bc7ff3c78d1acf4ab4ca49"} Oct 07 13:09:35 crc kubenswrapper[4677]: I1007 13:09:35.189410 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bldn4\" (UID: \"51dd4275-14c4-459b-a065-46ae2b4fd741\") " pod="openshift-image-registry/image-registry-697d97f7c8-bldn4" Oct 07 13:09:35 crc kubenswrapper[4677]: E1007 13:09:35.191320 4677 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 13:09:35.691306833 +0000 UTC m=+147.177015948 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bldn4" (UID: "51dd4275-14c4-459b-a065-46ae2b4fd741") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:09:35 crc kubenswrapper[4677]: I1007 13:09:35.192369 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-9mpgg" event={"ID":"f2c52f9a-2f5a-4bfe-9293-40e9b151fcd4","Type":"ContainerStarted","Data":"98c9bc2ba8448aec2f341782d553af6490cac12af744483d38f067537255b47a"} Oct 07 13:09:35 crc kubenswrapper[4677]: I1007 13:09:35.192422 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-9mpgg" event={"ID":"f2c52f9a-2f5a-4bfe-9293-40e9b151fcd4","Type":"ContainerStarted","Data":"435b457cf94137d6ed07b5f848dcc026f1c60645a1b99917a0beb9ae30436eb3"} Oct 07 13:09:35 crc kubenswrapper[4677]: I1007 13:09:35.206147 4677 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-wklrf" podStartSLOduration=126.206129757 podStartE2EDuration="2m6.206129757s" podCreationTimestamp="2025-10-07 13:07:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:09:35.205979162 +0000 UTC m=+146.691688287" watchObservedRunningTime="2025-10-07 13:09:35.206129757 +0000 UTC m=+146.691838872" Oct 07 13:09:35 crc kubenswrapper[4677]: I1007 13:09:35.207129 4677 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qgcjl" podStartSLOduration=126.207122393 podStartE2EDuration="2m6.207122393s" podCreationTimestamp="2025-10-07 13:07:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:09:35.17676246 +0000 UTC m=+146.662471575" watchObservedRunningTime="2025-10-07 13:09:35.207122393 +0000 UTC m=+146.692831508" Oct 07 13:09:35 crc kubenswrapper[4677]: I1007 13:09:35.220127 4677 generic.go:334] "Generic (PLEG): container finished" podID="12fff5e2-3fc4-4418-b0b8-04929a968823" containerID="7a985f223457005cf5342c8e5cb9c6733741b1d4aa5602a4fd0514db3a374bf3" exitCode=0 Oct 07 13:09:35 crc kubenswrapper[4677]: I1007 13:09:35.220245 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-nhjp9" event={"ID":"12fff5e2-3fc4-4418-b0b8-04929a968823","Type":"ContainerDied","Data":"7a985f223457005cf5342c8e5cb9c6733741b1d4aa5602a4fd0514db3a374bf3"} Oct 07 13:09:35 crc kubenswrapper[4677]: I1007 13:09:35.220275 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-nhjp9" event={"ID":"12fff5e2-3fc4-4418-b0b8-04929a968823","Type":"ContainerStarted","Data":"cee1abacc83ef9920cf2e2ee55ea87352ff6e589b3918c6f5b3135c62edb2eb4"} Oct 07 13:09:35 crc kubenswrapper[4677]: I1007 13:09:35.220861 4677 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-nhjp9" Oct 07 13:09:35 crc kubenswrapper[4677]: I1007 13:09:35.237342 4677 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-wpdhg" event={"ID":"1548b62c-1671-430a-9286-a999460ae8d3","Type":"ContainerStarted","Data":"b2ca7e478148c8c34f8179c580fcc4c4b7077686381003b9564465fe80ec3b49"} Oct 07 13:09:35 crc kubenswrapper[4677]: I1007 13:09:35.253900 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-fd92z" event={"ID":"3a01f67f-7cce-4bdc-8c89-10c4ac505f20","Type":"ContainerStarted","Data":"a74fb0ded92b79795b9684bbb6c2d7e39ac8554494197bdba67ccf01ef7b54da"} Oct 07 13:09:35 crc kubenswrapper[4677]: I1007 13:09:35.253941 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-fd92z" event={"ID":"3a01f67f-7cce-4bdc-8c89-10c4ac505f20","Type":"ContainerStarted","Data":"1968bbc07112dca56fadbf3b637bd21c66e855e32c0acbb9415dab8ba1692775"} Oct 07 13:09:35 crc kubenswrapper[4677]: I1007 13:09:35.254539 4677 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-fd92z" Oct 07 13:09:35 crc kubenswrapper[4677]: I1007 13:09:35.256283 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-r6nq5" event={"ID":"9122c8d7-acc8-4ed0-81b0-79ea36536943","Type":"ContainerStarted","Data":"d4f926410d977918318838fdb2e35eaa227cf01c40979973354c564ae71855d9"} Oct 07 13:09:35 crc kubenswrapper[4677]: I1007 13:09:35.256308 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-r6nq5" event={"ID":"9122c8d7-acc8-4ed0-81b0-79ea36536943","Type":"ContainerStarted","Data":"97106e1decb2379e7ff949abee25dcf9c3c6f5c69a622dbade3fad042fa25533"} Oct 07 13:09:35 crc kubenswrapper[4677]: I1007 13:09:35.257007 4677 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-r6nq5" Oct 07 13:09:35 crc kubenswrapper[4677]: I1007 13:09:35.258583 4677 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-r6nq5 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.41:8080/healthz\": dial tcp 10.217.0.41:8080: connect: connection refused" start-of-body= Oct 07 13:09:35 crc kubenswrapper[4677]: I1007 13:09:35.258616 4677 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-r6nq5" podUID="9122c8d7-acc8-4ed0-81b0-79ea36536943" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.41:8080/healthz\": dial tcp 10.217.0.41:8080: connect: connection refused" Oct 07 13:09:35 crc kubenswrapper[4677]: I1007 13:09:35.262581 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-xcssh" event={"ID":"07dd67bc-303c-4a2a-bc68-e430cc5e63c2","Type":"ContainerStarted","Data":"d242fe95afc5bc94f1f7c798ae4794c6233125af6b172fb646db0c45e3ac212c"} Oct 07 13:09:35 crc kubenswrapper[4677]: I1007 13:09:35.265371 4677 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-9mpgg" podStartSLOduration=126.26535561 podStartE2EDuration="2m6.26535561s" podCreationTimestamp="2025-10-07 13:07:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-10-07 13:09:35.237735106 +0000 UTC m=+146.723444221" watchObservedRunningTime="2025-10-07 13:09:35.26535561 +0000 UTC m=+146.751064725" Oct 07 13:09:35 crc kubenswrapper[4677]: I1007 13:09:35.278900 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dqz8j" event={"ID":"f3e33968-c99b-483d-8a53-e4d92fba4e12","Type":"ContainerStarted","Data":"8718cda73d301c8c883d6c8160cab58d96c77668f829a424c4ec6617be33db61"} Oct 07 13:09:35 crc kubenswrapper[4677]: I1007 13:09:35.280610 4677 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-dqz8j container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.27:8443/healthz\": dial tcp 10.217.0.27:8443: connect: connection refused" start-of-body= Oct 07 13:09:35 crc kubenswrapper[4677]: I1007 13:09:35.280662 4677 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dqz8j" podUID="f3e33968-c99b-483d-8a53-e4d92fba4e12" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.27:8443/healthz\": dial tcp 10.217.0.27:8443: connect: connection refused" Oct 07 13:09:35 crc kubenswrapper[4677]: I1007 13:09:35.282752 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-rkv4v" event={"ID":"57faa01e-5137-4dde-9102-e80bb5891cc5","Type":"ContainerStarted","Data":"88d9ee5d6c445155fa2b9923712175731b05f324f367f5b273446167513c8989"} Oct 07 13:09:35 crc kubenswrapper[4677]: I1007 13:09:35.284206 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-c4qbb" event={"ID":"72fcb258-1307-45d4-bbff-c55e81ab1df3","Type":"ContainerStarted","Data":"d9f6f065dba91296814ae92cba1fdd91613c5954cc2fbe847481c6173cc15aa4"} Oct 07 13:09:35 crc kubenswrapper[4677]: I1007 13:09:35.292018 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jh976" event={"ID":"f5b08d09-cb5f-49e7-b4e3-d52b2e071ee5","Type":"ContainerStarted","Data":"ccad2d4f8aa1a5957f06786b34cb4cf349f3bf87f9b8959aba4f1fd0fb001b7b"} Oct 07 13:09:35 crc kubenswrapper[4677]: I1007 13:09:35.292064 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jh976" event={"ID":"f5b08d09-cb5f-49e7-b4e3-d52b2e071ee5","Type":"ContainerStarted","Data":"912860ca5f7bd0b3065b396ee6b7100ba6330cb7d2a271f25a08e8f6533556f6"} Oct 07 13:09:35 crc kubenswrapper[4677]: I1007 13:09:35.292267 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 13:09:35 crc kubenswrapper[4677]: E1007 13:09:35.292644 4677 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 13:09:35.79262122 +0000 UTC m=+147.278330325 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:09:35 crc kubenswrapper[4677]: I1007 13:09:35.292964 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bldn4\" (UID: \"51dd4275-14c4-459b-a065-46ae2b4fd741\") " pod="openshift-image-registry/image-registry-697d97f7c8-bldn4" Oct 07 13:09:35 crc kubenswrapper[4677]: E1007 13:09:35.294538 4677 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 13:09:35.79452097 +0000 UTC m=+147.280230205 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bldn4" (UID: "51dd4275-14c4-459b-a065-46ae2b4fd741") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:09:35 crc kubenswrapper[4677]: I1007 13:09:35.294536 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gsr6h" event={"ID":"d891b0d9-0d88-4bd7-9f01-0f5a4991dc92","Type":"ContainerStarted","Data":"c9702789a0a2e2546c5d135fb1bfc350bf86dda1c1c91c1386ba5668db7a320a"} Oct 07 13:09:35 crc kubenswrapper[4677]: I1007 13:09:35.294582 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gsr6h" event={"ID":"d891b0d9-0d88-4bd7-9f01-0f5a4991dc92","Type":"ContainerStarted","Data":"114e8615893b715f1fee7e8c6f8d542927a1b54534927d23a3f1d34bfd7d3b45"} Oct 07 13:09:35 crc kubenswrapper[4677]: I1007 13:09:35.295867 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-ctrht" event={"ID":"669c03f6-55a4-4fee-b442-99cf9863678e","Type":"ContainerStarted","Data":"6291aa72e76a27a53108053875b835ec05db8f130503853df913818bedd3d1bb"} Oct 07 13:09:35 crc kubenswrapper[4677]: I1007 13:09:35.296594 4677 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-ctrht" Oct 07 13:09:35 crc kubenswrapper[4677]: I1007 13:09:35.298421 4677 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-xcssh" podStartSLOduration=126.298405072 podStartE2EDuration="2m6.298405072s" podCreationTimestamp="2025-10-07 13:07:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:09:35.297712267 +0000 UTC m=+146.783421382" watchObservedRunningTime="2025-10-07 13:09:35.298405072 +0000 UTC m=+146.784114187" 
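[Editor's note — not part of the captured log.] The records above show the kubelet volume manager retrying MountVolume.MountDevice / UnmountVolume.TearDown every 500 ms (durationBeforeRetry) because "kubevirt.io.hostpath-provisioner" is not yet in its list of registered CSI drivers; the retries stop once the driver's registration socket is picked up (see the plugin_watcher entry later in the log). A minimal diagnostic sketch, assuming a reachable kubeconfig, the official `kubernetes` Python client, and the CRC node name "crc" (all assumptions, not taken from the log), is to read the node's CSINode object, which mirrors the kubelet's registered drivers:

```python
# Hypothetical diagnostic sketch: check which CSI drivers the kubelet has
# registered on the node, to see whether kubevirt.io.hostpath-provisioner
# has completed node registration yet.
from kubernetes import client, config

config.load_kube_config()                    # or config.load_incluster_config() inside a pod
storage = client.StorageV1Api()

node_name = "crc"                            # assumption: the single CRC node
csi_node = storage.read_csi_node(node_name)  # CSINode lists the drivers registered on this kubelet

drivers = [d.name for d in (csi_node.spec.drivers or [])]
print(f"CSI drivers registered on {node_name}: {drivers}")

# While "kubevirt.io.hostpath-provisioner" is absent from this list, the
# mount/unmount operations in the log keep failing and are retried on the
# 500 ms backoff shown in the nestedpendingoperations messages.
```

The sketch only reads cluster state; it does not change the retry behaviour, which is internal to the kubelet.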
Oct 07 13:09:35 crc kubenswrapper[4677]: I1007 13:09:35.307665 4677 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-ctrht container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.21:5443/healthz\": dial tcp 10.217.0.21:5443: connect: connection refused" start-of-body= Oct 07 13:09:35 crc kubenswrapper[4677]: I1007 13:09:35.307791 4677 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-ctrht" podUID="669c03f6-55a4-4fee-b442-99cf9863678e" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.21:5443/healthz\": dial tcp 10.217.0.21:5443: connect: connection refused" Oct 07 13:09:35 crc kubenswrapper[4677]: I1007 13:09:35.317569 4677 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-bt865" podStartSLOduration=7.317549594 podStartE2EDuration="7.317549594s" podCreationTimestamp="2025-10-07 13:09:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:09:35.267107504 +0000 UTC m=+146.752816619" watchObservedRunningTime="2025-10-07 13:09:35.317549594 +0000 UTC m=+146.803258699" Oct 07 13:09:35 crc kubenswrapper[4677]: I1007 13:09:35.328801 4677 patch_prober.go:28] interesting pod/downloads-7954f5f757-mwgkm container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" start-of-body= Oct 07 13:09:35 crc kubenswrapper[4677]: I1007 13:09:35.328859 4677 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-mwgkm" podUID="67be2c46-f396-48d1-ba5e-d21f8362a4dc" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" Oct 07 13:09:35 crc kubenswrapper[4677]: I1007 13:09:35.345828 4677 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-nhjp9" podStartSLOduration=127.345806451 podStartE2EDuration="2m7.345806451s" podCreationTimestamp="2025-10-07 13:07:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:09:35.342849173 +0000 UTC m=+146.828558288" watchObservedRunningTime="2025-10-07 13:09:35.345806451 +0000 UTC m=+146.831515566" Oct 07 13:09:35 crc kubenswrapper[4677]: I1007 13:09:35.358958 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gmgcc" event={"ID":"c9834654-a39c-4991-bd6b-db7c9e37b2d3","Type":"ContainerStarted","Data":"80583e69960bf80796192217d7f82ba2400f01ed57d2fcf1e7a0f177f994ac16"} Oct 07 13:09:35 crc kubenswrapper[4677]: I1007 13:09:35.359058 4677 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5x69q" Oct 07 13:09:35 crc kubenswrapper[4677]: I1007 13:09:35.361596 4677 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-r6nq5" podStartSLOduration=126.361581289 podStartE2EDuration="2m6.361581289s" podCreationTimestamp="2025-10-07 13:07:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 
UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:09:35.36023986 +0000 UTC m=+146.845948975" watchObservedRunningTime="2025-10-07 13:09:35.361581289 +0000 UTC m=+146.847290404" Oct 07 13:09:35 crc kubenswrapper[4677]: I1007 13:09:35.392950 4677 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-fd92z" podStartSLOduration=126.392932019 podStartE2EDuration="2m6.392932019s" podCreationTimestamp="2025-10-07 13:07:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:09:35.392463302 +0000 UTC m=+146.878172427" watchObservedRunningTime="2025-10-07 13:09:35.392932019 +0000 UTC m=+146.878641144" Oct 07 13:09:35 crc kubenswrapper[4677]: I1007 13:09:35.393735 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 13:09:35 crc kubenswrapper[4677]: E1007 13:09:35.395409 4677 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 13:09:35.895396819 +0000 UTC m=+147.381105934 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:09:35 crc kubenswrapper[4677]: I1007 13:09:35.467668 4677 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-rkv4v" podStartSLOduration=126.46765265 podStartE2EDuration="2m6.46765265s" podCreationTimestamp="2025-10-07 13:07:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:09:35.467067069 +0000 UTC m=+146.952776174" watchObservedRunningTime="2025-10-07 13:09:35.46765265 +0000 UTC m=+146.953361765" Oct 07 13:09:35 crc kubenswrapper[4677]: I1007 13:09:35.467757 4677 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-c4qbb" podStartSLOduration=126.467753044 podStartE2EDuration="2m6.467753044s" podCreationTimestamp="2025-10-07 13:07:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:09:35.428016876 +0000 UTC m=+146.913725991" watchObservedRunningTime="2025-10-07 13:09:35.467753044 +0000 UTC m=+146.953462159" Oct 07 13:09:35 crc kubenswrapper[4677]: I1007 13:09:35.478499 4677 patch_prober.go:28] interesting pod/apiserver-76f77b778f-bkhfv container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" 
start-of-body=[+]ping ok Oct 07 13:09:35 crc kubenswrapper[4677]: [+]log ok Oct 07 13:09:35 crc kubenswrapper[4677]: [+]etcd ok Oct 07 13:09:35 crc kubenswrapper[4677]: [+]poststarthook/start-apiserver-admission-initializer ok Oct 07 13:09:35 crc kubenswrapper[4677]: [+]poststarthook/generic-apiserver-start-informers ok Oct 07 13:09:35 crc kubenswrapper[4677]: [+]poststarthook/max-in-flight-filter ok Oct 07 13:09:35 crc kubenswrapper[4677]: [+]poststarthook/storage-object-count-tracker-hook ok Oct 07 13:09:35 crc kubenswrapper[4677]: [+]poststarthook/image.openshift.io-apiserver-caches ok Oct 07 13:09:35 crc kubenswrapper[4677]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Oct 07 13:09:35 crc kubenswrapper[4677]: [-]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa failed: reason withheld Oct 07 13:09:35 crc kubenswrapper[4677]: [+]poststarthook/project.openshift.io-projectcache ok Oct 07 13:09:35 crc kubenswrapper[4677]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Oct 07 13:09:35 crc kubenswrapper[4677]: [+]poststarthook/openshift.io-startinformers ok Oct 07 13:09:35 crc kubenswrapper[4677]: [+]poststarthook/openshift.io-restmapperupdater ok Oct 07 13:09:35 crc kubenswrapper[4677]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Oct 07 13:09:35 crc kubenswrapper[4677]: livez check failed Oct 07 13:09:35 crc kubenswrapper[4677]: I1007 13:09:35.478555 4677 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-bkhfv" podUID="a5919bd4-5d4e-4aa1-9b66-36460e7e24f3" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 07 13:09:35 crc kubenswrapper[4677]: I1007 13:09:35.495135 4677 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jh976" podStartSLOduration=126.495120218 podStartE2EDuration="2m6.495120218s" podCreationTimestamp="2025-10-07 13:07:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:09:35.493747517 +0000 UTC m=+146.979456632" watchObservedRunningTime="2025-10-07 13:09:35.495120218 +0000 UTC m=+146.980829333" Oct 07 13:09:35 crc kubenswrapper[4677]: I1007 13:09:35.495988 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bldn4\" (UID: \"51dd4275-14c4-459b-a065-46ae2b4fd741\") " pod="openshift-image-registry/image-registry-697d97f7c8-bldn4" Oct 07 13:09:35 crc kubenswrapper[4677]: E1007 13:09:35.496332 4677 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 13:09:35.996319142 +0000 UTC m=+147.482028247 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bldn4" (UID: "51dd4275-14c4-459b-a065-46ae2b4fd741") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:09:35 crc kubenswrapper[4677]: I1007 13:09:35.550776 4677 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gmgcc" podStartSLOduration=126.550759359 podStartE2EDuration="2m6.550759359s" podCreationTimestamp="2025-10-07 13:07:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:09:35.550403136 +0000 UTC m=+147.036112251" watchObservedRunningTime="2025-10-07 13:09:35.550759359 +0000 UTC m=+147.036468464" Oct 07 13:09:35 crc kubenswrapper[4677]: I1007 13:09:35.597351 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 13:09:35 crc kubenswrapper[4677]: E1007 13:09:35.597790 4677 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 13:09:36.097775984 +0000 UTC m=+147.583485099 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:09:35 crc kubenswrapper[4677]: I1007 13:09:35.687817 4677 patch_prober.go:28] interesting pod/router-default-5444994796-7pwvn container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 07 13:09:35 crc kubenswrapper[4677]: [-]has-synced failed: reason withheld Oct 07 13:09:35 crc kubenswrapper[4677]: [+]process-running ok Oct 07 13:09:35 crc kubenswrapper[4677]: healthz check failed Oct 07 13:09:35 crc kubenswrapper[4677]: I1007 13:09:35.688097 4677 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-7pwvn" podUID="6773777c-949c-46da-95b7-c6008e52b396" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 07 13:09:35 crc kubenswrapper[4677]: I1007 13:09:35.699374 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bldn4\" (UID: \"51dd4275-14c4-459b-a065-46ae2b4fd741\") " pod="openshift-image-registry/image-registry-697d97f7c8-bldn4" Oct 07 13:09:35 crc kubenswrapper[4677]: E1007 13:09:35.699900 4677 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 13:09:36.19988162 +0000 UTC m=+147.685590805 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bldn4" (UID: "51dd4275-14c4-459b-a065-46ae2b4fd741") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:09:35 crc kubenswrapper[4677]: I1007 13:09:35.794215 4677 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-hn2p2" Oct 07 13:09:35 crc kubenswrapper[4677]: I1007 13:09:35.800797 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 13:09:35 crc kubenswrapper[4677]: E1007 13:09:35.801218 4677 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 13:09:36.301202257 +0000 UTC m=+147.786911372 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:09:35 crc kubenswrapper[4677]: I1007 13:09:35.813494 4677 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-ctrht" podStartSLOduration=126.813475307 podStartE2EDuration="2m6.813475307s" podCreationTimestamp="2025-10-07 13:07:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:09:35.575776627 +0000 UTC m=+147.061485742" watchObservedRunningTime="2025-10-07 13:09:35.813475307 +0000 UTC m=+147.299184432" Oct 07 13:09:35 crc kubenswrapper[4677]: I1007 13:09:35.902842 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bldn4\" (UID: \"51dd4275-14c4-459b-a065-46ae2b4fd741\") " pod="openshift-image-registry/image-registry-697d97f7c8-bldn4" Oct 07 13:09:35 crc kubenswrapper[4677]: E1007 13:09:35.903257 4677 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 13:09:36.40324217 +0000 UTC m=+147.888951285 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bldn4" (UID: "51dd4275-14c4-459b-a065-46ae2b4fd741") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:09:36 crc kubenswrapper[4677]: I1007 13:09:36.004223 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 13:09:36 crc kubenswrapper[4677]: E1007 13:09:36.004608 4677 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 13:09:36.504592838 +0000 UTC m=+147.990301943 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:09:36 crc kubenswrapper[4677]: I1007 13:09:36.105492 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bldn4\" (UID: \"51dd4275-14c4-459b-a065-46ae2b4fd741\") " pod="openshift-image-registry/image-registry-697d97f7c8-bldn4" Oct 07 13:09:36 crc kubenswrapper[4677]: E1007 13:09:36.105834 4677 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 13:09:36.605822112 +0000 UTC m=+148.091531227 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bldn4" (UID: "51dd4275-14c4-459b-a065-46ae2b4fd741") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:09:36 crc kubenswrapper[4677]: I1007 13:09:36.206920 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 13:09:36 crc kubenswrapper[4677]: E1007 13:09:36.207617 4677 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 13:09:36.707599065 +0000 UTC m=+148.193308180 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:09:36 crc kubenswrapper[4677]: I1007 13:09:36.308578 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bldn4\" (UID: \"51dd4275-14c4-459b-a065-46ae2b4fd741\") " pod="openshift-image-registry/image-registry-697d97f7c8-bldn4" Oct 07 13:09:36 crc kubenswrapper[4677]: E1007 13:09:36.309124 4677 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 13:09:36.809113499 +0000 UTC m=+148.294822614 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bldn4" (UID: "51dd4275-14c4-459b-a065-46ae2b4fd741") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:09:36 crc kubenswrapper[4677]: I1007 13:09:36.324905 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-wpdhg" event={"ID":"1548b62c-1671-430a-9286-a999460ae8d3","Type":"ContainerStarted","Data":"7f441ef92ab827f4041fa1e1075a291be7267135577ae2f5cace113c1670214c"} Oct 07 13:09:36 crc kubenswrapper[4677]: I1007 13:09:36.326587 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-b9mxw" event={"ID":"1ed24530-102d-45f3-9d9e-e74a7fefdd7e","Type":"ContainerStarted","Data":"9ce2795c34c539440de38354b9216f3f165563e5ed1223e56f9898297b31f6ef"} Oct 07 13:09:36 crc kubenswrapper[4677]: I1007 13:09:36.328930 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gsr6h" event={"ID":"d891b0d9-0d88-4bd7-9f01-0f5a4991dc92","Type":"ContainerStarted","Data":"808f65fea28cbf7470bc218b38f6d6dfc9881dba21b8c232ef67a93c80413b3e"} Oct 07 13:09:36 crc kubenswrapper[4677]: I1007 13:09:36.330939 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-cn5r8" event={"ID":"4b5e8fd2-4cb0-4bfd-8ce1-17f58352a3d8","Type":"ContainerStarted","Data":"59972634672e864c217c5c4b629a835992b0d16ea743126b79cffdc246b942fe"} Oct 07 13:09:36 crc kubenswrapper[4677]: I1007 13:09:36.333404 4677 patch_prober.go:28] interesting pod/downloads-7954f5f757-mwgkm container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" start-of-body= Oct 07 13:09:36 crc kubenswrapper[4677]: I1007 13:09:36.333464 4677 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-console/downloads-7954f5f757-mwgkm" podUID="67be2c46-f396-48d1-ba5e-d21f8362a4dc" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" Oct 07 13:09:36 crc kubenswrapper[4677]: I1007 13:09:36.334381 4677 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-r6nq5 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.41:8080/healthz\": dial tcp 10.217.0.41:8080: connect: connection refused" start-of-body= Oct 07 13:09:36 crc kubenswrapper[4677]: I1007 13:09:36.334410 4677 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-r6nq5" podUID="9122c8d7-acc8-4ed0-81b0-79ea36536943" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.41:8080/healthz\": dial tcp 10.217.0.41:8080: connect: connection refused" Oct 07 13:09:36 crc kubenswrapper[4677]: I1007 13:09:36.338636 4677 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dqz8j" Oct 07 13:09:36 crc kubenswrapper[4677]: I1007 13:09:36.338870 4677 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lprql" Oct 07 13:09:36 crc kubenswrapper[4677]: I1007 13:09:36.373043 4677 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-b9mxw" podStartSLOduration=127.373022334 podStartE2EDuration="2m7.373022334s" podCreationTimestamp="2025-10-07 13:07:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:09:36.372385701 +0000 UTC m=+147.858094816" watchObservedRunningTime="2025-10-07 13:09:36.373022334 +0000 UTC m=+147.858731449" Oct 07 13:09:36 crc kubenswrapper[4677]: I1007 13:09:36.415490 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 13:09:36 crc kubenswrapper[4677]: E1007 13:09:36.415686 4677 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 13:09:36.915659048 +0000 UTC m=+148.401368173 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:09:36 crc kubenswrapper[4677]: I1007 13:09:36.418594 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bldn4\" (UID: \"51dd4275-14c4-459b-a065-46ae2b4fd741\") " pod="openshift-image-registry/image-registry-697d97f7c8-bldn4" Oct 07 13:09:36 crc kubenswrapper[4677]: E1007 13:09:36.425619 4677 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 13:09:36.925607603 +0000 UTC m=+148.411316718 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bldn4" (UID: "51dd4275-14c4-459b-a065-46ae2b4fd741") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:09:36 crc kubenswrapper[4677]: I1007 13:09:36.521342 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 13:09:36 crc kubenswrapper[4677]: E1007 13:09:36.522221 4677 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 13:09:37.022200397 +0000 UTC m=+148.507909512 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:09:36 crc kubenswrapper[4677]: I1007 13:09:36.529247 4677 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gsr6h" podStartSLOduration=127.529229275 podStartE2EDuration="2m7.529229275s" podCreationTimestamp="2025-10-07 13:07:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:09:36.491361755 +0000 UTC m=+147.977070890" watchObservedRunningTime="2025-10-07 13:09:36.529229275 +0000 UTC m=+148.014938390" Oct 07 13:09:36 crc kubenswrapper[4677]: I1007 13:09:36.571742 4677 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-cn5r8" podStartSLOduration=127.571724023 podStartE2EDuration="2m7.571724023s" podCreationTimestamp="2025-10-07 13:07:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:09:36.530530502 +0000 UTC m=+148.016239627" watchObservedRunningTime="2025-10-07 13:09:36.571724023 +0000 UTC m=+148.057433138" Oct 07 13:09:36 crc kubenswrapper[4677]: I1007 13:09:36.623402 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bldn4\" (UID: \"51dd4275-14c4-459b-a065-46ae2b4fd741\") " pod="openshift-image-registry/image-registry-697d97f7c8-bldn4" Oct 07 13:09:36 crc kubenswrapper[4677]: E1007 13:09:36.623780 4677 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 13:09:37.123750512 +0000 UTC m=+148.609459627 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bldn4" (UID: "51dd4275-14c4-459b-a065-46ae2b4fd741") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:09:36 crc kubenswrapper[4677]: I1007 13:09:36.686627 4677 patch_prober.go:28] interesting pod/router-default-5444994796-7pwvn container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 07 13:09:36 crc kubenswrapper[4677]: [-]has-synced failed: reason withheld Oct 07 13:09:36 crc kubenswrapper[4677]: [+]process-running ok Oct 07 13:09:36 crc kubenswrapper[4677]: healthz check failed Oct 07 13:09:36 crc kubenswrapper[4677]: I1007 13:09:36.686699 4677 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-7pwvn" podUID="6773777c-949c-46da-95b7-c6008e52b396" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 07 13:09:36 crc kubenswrapper[4677]: I1007 13:09:36.724460 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 13:09:36 crc kubenswrapper[4677]: E1007 13:09:36.724849 4677 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 13:09:37.22483108 +0000 UTC m=+148.710540195 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:09:36 crc kubenswrapper[4677]: I1007 13:09:36.826215 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bldn4\" (UID: \"51dd4275-14c4-459b-a065-46ae2b4fd741\") " pod="openshift-image-registry/image-registry-697d97f7c8-bldn4" Oct 07 13:09:36 crc kubenswrapper[4677]: E1007 13:09:36.826580 4677 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 13:09:37.326568753 +0000 UTC m=+148.812277868 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bldn4" (UID: "51dd4275-14c4-459b-a065-46ae2b4fd741") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:09:36 crc kubenswrapper[4677]: I1007 13:09:36.927907 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 13:09:36 crc kubenswrapper[4677]: E1007 13:09:36.928087 4677 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 13:09:37.428061796 +0000 UTC m=+148.913770911 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:09:36 crc kubenswrapper[4677]: I1007 13:09:36.928134 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bldn4\" (UID: \"51dd4275-14c4-459b-a065-46ae2b4fd741\") " pod="openshift-image-registry/image-registry-697d97f7c8-bldn4" Oct 07 13:09:36 crc kubenswrapper[4677]: E1007 13:09:36.928470 4677 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 13:09:37.42845777 +0000 UTC m=+148.914166885 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bldn4" (UID: "51dd4275-14c4-459b-a065-46ae2b4fd741") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:09:37 crc kubenswrapper[4677]: I1007 13:09:37.000116 4677 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-ctrht" Oct 07 13:09:37 crc kubenswrapper[4677]: I1007 13:09:37.028955 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 13:09:37 crc kubenswrapper[4677]: E1007 13:09:37.029143 4677 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 13:09:37.529117693 +0000 UTC m=+149.014826808 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:09:37 crc kubenswrapper[4677]: I1007 13:09:37.029239 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bldn4\" (UID: \"51dd4275-14c4-459b-a065-46ae2b4fd741\") " pod="openshift-image-registry/image-registry-697d97f7c8-bldn4" Oct 07 13:09:37 crc kubenswrapper[4677]: E1007 13:09:37.029589 4677 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 13:09:37.52958165 +0000 UTC m=+149.015290765 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bldn4" (UID: "51dd4275-14c4-459b-a065-46ae2b4fd741") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:09:37 crc kubenswrapper[4677]: I1007 13:09:37.062215 4677 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Oct 07 13:09:37 crc kubenswrapper[4677]: I1007 13:09:37.130239 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 13:09:37 crc kubenswrapper[4677]: E1007 13:09:37.130448 4677 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 13:09:37.630404979 +0000 UTC m=+149.116114094 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:09:37 crc kubenswrapper[4677]: I1007 13:09:37.130763 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bldn4\" (UID: \"51dd4275-14c4-459b-a065-46ae2b4fd741\") " pod="openshift-image-registry/image-registry-697d97f7c8-bldn4" Oct 07 13:09:37 crc kubenswrapper[4677]: E1007 13:09:37.131155 4677 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 13:09:37.631137966 +0000 UTC m=+149.116847141 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bldn4" (UID: "51dd4275-14c4-459b-a065-46ae2b4fd741") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:09:37 crc kubenswrapper[4677]: I1007 13:09:37.145678 4677 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-27x4s"] Oct 07 13:09:37 crc kubenswrapper[4677]: I1007 13:09:37.152140 4677 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-27x4s" Oct 07 13:09:37 crc kubenswrapper[4677]: I1007 13:09:37.159836 4677 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Oct 07 13:09:37 crc kubenswrapper[4677]: I1007 13:09:37.161335 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-27x4s"] Oct 07 13:09:37 crc kubenswrapper[4677]: I1007 13:09:37.231893 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 13:09:37 crc kubenswrapper[4677]: I1007 13:09:37.232195 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:09:37 crc kubenswrapper[4677]: I1007 13:09:37.232266 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:09:37 crc kubenswrapper[4677]: I1007 13:09:37.232295 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h5qxl\" (UniqueName: \"kubernetes.io/projected/57d9fc75-7df6-4205-9600-0e0d0ff04f8a-kube-api-access-h5qxl\") pod \"certified-operators-27x4s\" (UID: \"57d9fc75-7df6-4205-9600-0e0d0ff04f8a\") " pod="openshift-marketplace/certified-operators-27x4s" Oct 07 13:09:37 crc kubenswrapper[4677]: I1007 13:09:37.232323 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57d9fc75-7df6-4205-9600-0e0d0ff04f8a-utilities\") pod \"certified-operators-27x4s\" (UID: \"57d9fc75-7df6-4205-9600-0e0d0ff04f8a\") " pod="openshift-marketplace/certified-operators-27x4s" Oct 07 13:09:37 crc kubenswrapper[4677]: I1007 13:09:37.232385 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57d9fc75-7df6-4205-9600-0e0d0ff04f8a-catalog-content\") pod \"certified-operators-27x4s\" (UID: \"57d9fc75-7df6-4205-9600-0e0d0ff04f8a\") " pod="openshift-marketplace/certified-operators-27x4s" Oct 07 13:09:37 crc kubenswrapper[4677]: E1007 13:09:37.232530 4677 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 13:09:37.732500664 +0000 UTC m=+149.218209779 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:09:37 crc kubenswrapper[4677]: I1007 13:09:37.233744 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:09:37 crc kubenswrapper[4677]: I1007 13:09:37.238480 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:09:37 crc kubenswrapper[4677]: I1007 13:09:37.312675 4677 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-hlxmv"] Oct 07 13:09:37 crc kubenswrapper[4677]: I1007 13:09:37.313607 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hlxmv" Oct 07 13:09:37 crc kubenswrapper[4677]: I1007 13:09:37.315488 4677 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Oct 07 13:09:37 crc kubenswrapper[4677]: I1007 13:09:37.328647 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hlxmv"] Oct 07 13:09:37 crc kubenswrapper[4677]: I1007 13:09:37.333284 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57d9fc75-7df6-4205-9600-0e0d0ff04f8a-utilities\") pod \"certified-operators-27x4s\" (UID: \"57d9fc75-7df6-4205-9600-0e0d0ff04f8a\") " pod="openshift-marketplace/certified-operators-27x4s" Oct 07 13:09:37 crc kubenswrapper[4677]: I1007 13:09:37.333324 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 13:09:37 crc kubenswrapper[4677]: I1007 13:09:37.333352 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bldn4\" (UID: \"51dd4275-14c4-459b-a065-46ae2b4fd741\") " pod="openshift-image-registry/image-registry-697d97f7c8-bldn4" Oct 07 13:09:37 crc kubenswrapper[4677]: I1007 13:09:37.333373 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: 
\"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 13:09:37 crc kubenswrapper[4677]: I1007 13:09:37.333400 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57d9fc75-7df6-4205-9600-0e0d0ff04f8a-catalog-content\") pod \"certified-operators-27x4s\" (UID: \"57d9fc75-7df6-4205-9600-0e0d0ff04f8a\") " pod="openshift-marketplace/certified-operators-27x4s" Oct 07 13:09:37 crc kubenswrapper[4677]: I1007 13:09:37.333478 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h5qxl\" (UniqueName: \"kubernetes.io/projected/57d9fc75-7df6-4205-9600-0e0d0ff04f8a-kube-api-access-h5qxl\") pod \"certified-operators-27x4s\" (UID: \"57d9fc75-7df6-4205-9600-0e0d0ff04f8a\") " pod="openshift-marketplace/certified-operators-27x4s" Oct 07 13:09:37 crc kubenswrapper[4677]: E1007 13:09:37.333690 4677 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 13:09:37.833674256 +0000 UTC m=+149.319383371 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bldn4" (UID: "51dd4275-14c4-459b-a065-46ae2b4fd741") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:09:37 crc kubenswrapper[4677]: I1007 13:09:37.334228 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57d9fc75-7df6-4205-9600-0e0d0ff04f8a-utilities\") pod \"certified-operators-27x4s\" (UID: \"57d9fc75-7df6-4205-9600-0e0d0ff04f8a\") " pod="openshift-marketplace/certified-operators-27x4s" Oct 07 13:09:37 crc kubenswrapper[4677]: I1007 13:09:37.334534 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57d9fc75-7df6-4205-9600-0e0d0ff04f8a-catalog-content\") pod \"certified-operators-27x4s\" (UID: \"57d9fc75-7df6-4205-9600-0e0d0ff04f8a\") " pod="openshift-marketplace/certified-operators-27x4s" Oct 07 13:09:37 crc kubenswrapper[4677]: I1007 13:09:37.337292 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 13:09:37 crc kubenswrapper[4677]: I1007 13:09:37.341685 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 13:09:37 crc kubenswrapper[4677]: I1007 13:09:37.354311 
4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h5qxl\" (UniqueName: \"kubernetes.io/projected/57d9fc75-7df6-4205-9600-0e0d0ff04f8a-kube-api-access-h5qxl\") pod \"certified-operators-27x4s\" (UID: \"57d9fc75-7df6-4205-9600-0e0d0ff04f8a\") " pod="openshift-marketplace/certified-operators-27x4s" Oct 07 13:09:37 crc kubenswrapper[4677]: I1007 13:09:37.381591 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-wpdhg" event={"ID":"1548b62c-1671-430a-9286-a999460ae8d3","Type":"ContainerStarted","Data":"aa8c8ac3d4803dc0071da675800a93c91dde1892b9053d5b58cc501b6792aa7f"} Oct 07 13:09:37 crc kubenswrapper[4677]: I1007 13:09:37.381650 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-wpdhg" event={"ID":"1548b62c-1671-430a-9286-a999460ae8d3","Type":"ContainerStarted","Data":"437d8b72893f5ecae34bd3f9e82cb7c3dae1d2fb3101a15107555c1d788d5a1b"} Oct 07 13:09:37 crc kubenswrapper[4677]: I1007 13:09:37.381662 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-wpdhg" event={"ID":"1548b62c-1671-430a-9286-a999460ae8d3","Type":"ContainerStarted","Data":"9551bd03ea28ed5db0625795aacbd2312c0587cc9735ffedbdd48f60ff3a81d5"} Oct 07 13:09:37 crc kubenswrapper[4677]: I1007 13:09:37.385177 4677 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-r6nq5" Oct 07 13:09:37 crc kubenswrapper[4677]: I1007 13:09:37.391325 4677 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-nhjp9" Oct 07 13:09:37 crc kubenswrapper[4677]: I1007 13:09:37.410738 4677 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-wpdhg" podStartSLOduration=9.410720202 podStartE2EDuration="9.410720202s" podCreationTimestamp="2025-10-07 13:09:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:09:37.407671011 +0000 UTC m=+148.893380126" watchObservedRunningTime="2025-10-07 13:09:37.410720202 +0000 UTC m=+148.896429317" Oct 07 13:09:37 crc kubenswrapper[4677]: I1007 13:09:37.418666 4677 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 13:09:37 crc kubenswrapper[4677]: I1007 13:09:37.434526 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 13:09:37 crc kubenswrapper[4677]: I1007 13:09:37.434706 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9473eea-460d-4148-8b4f-f2e0ccba3b2e-utilities\") pod \"community-operators-hlxmv\" (UID: \"c9473eea-460d-4148-8b4f-f2e0ccba3b2e\") " pod="openshift-marketplace/community-operators-hlxmv" Oct 07 13:09:37 crc kubenswrapper[4677]: I1007 13:09:37.434740 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xs5rg\" (UniqueName: \"kubernetes.io/projected/c9473eea-460d-4148-8b4f-f2e0ccba3b2e-kube-api-access-xs5rg\") pod \"community-operators-hlxmv\" (UID: \"c9473eea-460d-4148-8b4f-f2e0ccba3b2e\") " pod="openshift-marketplace/community-operators-hlxmv" Oct 07 13:09:37 crc kubenswrapper[4677]: I1007 13:09:37.434798 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9473eea-460d-4148-8b4f-f2e0ccba3b2e-catalog-content\") pod \"community-operators-hlxmv\" (UID: \"c9473eea-460d-4148-8b4f-f2e0ccba3b2e\") " pod="openshift-marketplace/community-operators-hlxmv" Oct 07 13:09:37 crc kubenswrapper[4677]: E1007 13:09:37.434890 4677 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 13:09:37.934875569 +0000 UTC m=+149.420584684 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:09:37 crc kubenswrapper[4677]: I1007 13:09:37.472047 4677 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-27x4s" Oct 07 13:09:37 crc kubenswrapper[4677]: I1007 13:09:37.480475 4677 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2025-10-07T13:09:37.062250329Z","Handler":null,"Name":""} Oct 07 13:09:37 crc kubenswrapper[4677]: I1007 13:09:37.495118 4677 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Oct 07 13:09:37 crc kubenswrapper[4677]: I1007 13:09:37.495152 4677 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Oct 07 13:09:37 crc kubenswrapper[4677]: I1007 13:09:37.534812 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:09:37 crc kubenswrapper[4677]: I1007 13:09:37.536222 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bldn4\" (UID: \"51dd4275-14c4-459b-a065-46ae2b4fd741\") " pod="openshift-image-registry/image-registry-697d97f7c8-bldn4" Oct 07 13:09:37 crc kubenswrapper[4677]: I1007 13:09:37.536320 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9473eea-460d-4148-8b4f-f2e0ccba3b2e-utilities\") pod \"community-operators-hlxmv\" (UID: \"c9473eea-460d-4148-8b4f-f2e0ccba3b2e\") " pod="openshift-marketplace/community-operators-hlxmv" Oct 07 13:09:37 crc kubenswrapper[4677]: I1007 13:09:37.536393 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xs5rg\" (UniqueName: \"kubernetes.io/projected/c9473eea-460d-4148-8b4f-f2e0ccba3b2e-kube-api-access-xs5rg\") pod \"community-operators-hlxmv\" (UID: \"c9473eea-460d-4148-8b4f-f2e0ccba3b2e\") " pod="openshift-marketplace/community-operators-hlxmv" Oct 07 13:09:37 crc kubenswrapper[4677]: I1007 13:09:37.536700 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9473eea-460d-4148-8b4f-f2e0ccba3b2e-catalog-content\") pod \"community-operators-hlxmv\" (UID: \"c9473eea-460d-4148-8b4f-f2e0ccba3b2e\") " pod="openshift-marketplace/community-operators-hlxmv" Oct 07 13:09:37 crc kubenswrapper[4677]: I1007 13:09:37.539397 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9473eea-460d-4148-8b4f-f2e0ccba3b2e-utilities\") pod \"community-operators-hlxmv\" (UID: \"c9473eea-460d-4148-8b4f-f2e0ccba3b2e\") " pod="openshift-marketplace/community-operators-hlxmv" Oct 07 13:09:37 crc kubenswrapper[4677]: I1007 13:09:37.540586 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9473eea-460d-4148-8b4f-f2e0ccba3b2e-catalog-content\") pod \"community-operators-hlxmv\" (UID: \"c9473eea-460d-4148-8b4f-f2e0ccba3b2e\") " pod="openshift-marketplace/community-operators-hlxmv" Oct 07 13:09:37 crc kubenswrapper[4677]: I1007 13:09:37.541305 
4677 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-fr6vh"] Oct 07 13:09:37 crc kubenswrapper[4677]: I1007 13:09:37.543019 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 13:09:37 crc kubenswrapper[4677]: I1007 13:09:37.544330 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-fr6vh"] Oct 07 13:09:37 crc kubenswrapper[4677]: I1007 13:09:37.544438 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-fr6vh" Oct 07 13:09:37 crc kubenswrapper[4677]: I1007 13:09:37.546398 4677 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Oct 07 13:09:37 crc kubenswrapper[4677]: I1007 13:09:37.546461 4677 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bldn4\" (UID: \"51dd4275-14c4-459b-a065-46ae2b4fd741\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-bldn4" Oct 07 13:09:37 crc kubenswrapper[4677]: I1007 13:09:37.608728 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xs5rg\" (UniqueName: \"kubernetes.io/projected/c9473eea-460d-4148-8b4f-f2e0ccba3b2e-kube-api-access-xs5rg\") pod \"community-operators-hlxmv\" (UID: \"c9473eea-460d-4148-8b4f-f2e0ccba3b2e\") " pod="openshift-marketplace/community-operators-hlxmv" Oct 07 13:09:37 crc kubenswrapper[4677]: I1007 13:09:37.629834 4677 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-hlxmv" Oct 07 13:09:37 crc kubenswrapper[4677]: I1007 13:09:37.647940 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bldn4\" (UID: \"51dd4275-14c4-459b-a065-46ae2b4fd741\") " pod="openshift-image-registry/image-registry-697d97f7c8-bldn4" Oct 07 13:09:37 crc kubenswrapper[4677]: I1007 13:09:37.648718 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w9fhn\" (UniqueName: \"kubernetes.io/projected/f81da05e-fd1c-4a91-947d-f5d6958518d0-kube-api-access-w9fhn\") pod \"certified-operators-fr6vh\" (UID: \"f81da05e-fd1c-4a91-947d-f5d6958518d0\") " pod="openshift-marketplace/certified-operators-fr6vh" Oct 07 13:09:37 crc kubenswrapper[4677]: I1007 13:09:37.648775 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f81da05e-fd1c-4a91-947d-f5d6958518d0-utilities\") pod \"certified-operators-fr6vh\" (UID: \"f81da05e-fd1c-4a91-947d-f5d6958518d0\") " pod="openshift-marketplace/certified-operators-fr6vh" Oct 07 13:09:37 crc kubenswrapper[4677]: I1007 13:09:37.648800 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f81da05e-fd1c-4a91-947d-f5d6958518d0-catalog-content\") pod \"certified-operators-fr6vh\" (UID: \"f81da05e-fd1c-4a91-947d-f5d6958518d0\") " pod="openshift-marketplace/certified-operators-fr6vh" Oct 07 13:09:37 crc kubenswrapper[4677]: I1007 13:09:37.687816 4677 patch_prober.go:28] interesting pod/router-default-5444994796-7pwvn container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 07 13:09:37 crc kubenswrapper[4677]: [-]has-synced failed: reason withheld Oct 07 13:09:37 crc kubenswrapper[4677]: [+]process-running ok Oct 07 13:09:37 crc kubenswrapper[4677]: healthz check failed Oct 07 13:09:37 crc kubenswrapper[4677]: I1007 13:09:37.687855 4677 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-7pwvn" podUID="6773777c-949c-46da-95b7-c6008e52b396" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 07 13:09:37 crc kubenswrapper[4677]: I1007 13:09:37.715542 4677 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-vmcx7"] Oct 07 13:09:37 crc kubenswrapper[4677]: I1007 13:09:37.717148 4677 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vmcx7" Oct 07 13:09:37 crc kubenswrapper[4677]: I1007 13:09:37.730555 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vmcx7"] Oct 07 13:09:37 crc kubenswrapper[4677]: I1007 13:09:37.751234 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 13:09:37 crc kubenswrapper[4677]: I1007 13:09:37.751425 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f81da05e-fd1c-4a91-947d-f5d6958518d0-utilities\") pod \"certified-operators-fr6vh\" (UID: \"f81da05e-fd1c-4a91-947d-f5d6958518d0\") " pod="openshift-marketplace/certified-operators-fr6vh" Oct 07 13:09:37 crc kubenswrapper[4677]: I1007 13:09:37.751472 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f81da05e-fd1c-4a91-947d-f5d6958518d0-catalog-content\") pod \"certified-operators-fr6vh\" (UID: \"f81da05e-fd1c-4a91-947d-f5d6958518d0\") " pod="openshift-marketplace/certified-operators-fr6vh" Oct 07 13:09:37 crc kubenswrapper[4677]: I1007 13:09:37.751522 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w9fhn\" (UniqueName: \"kubernetes.io/projected/f81da05e-fd1c-4a91-947d-f5d6958518d0-kube-api-access-w9fhn\") pod \"certified-operators-fr6vh\" (UID: \"f81da05e-fd1c-4a91-947d-f5d6958518d0\") " pod="openshift-marketplace/certified-operators-fr6vh" Oct 07 13:09:37 crc kubenswrapper[4677]: I1007 13:09:37.751967 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f81da05e-fd1c-4a91-947d-f5d6958518d0-utilities\") pod \"certified-operators-fr6vh\" (UID: \"f81da05e-fd1c-4a91-947d-f5d6958518d0\") " pod="openshift-marketplace/certified-operators-fr6vh" Oct 07 13:09:37 crc kubenswrapper[4677]: I1007 13:09:37.752062 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f81da05e-fd1c-4a91-947d-f5d6958518d0-catalog-content\") pod \"certified-operators-fr6vh\" (UID: \"f81da05e-fd1c-4a91-947d-f5d6958518d0\") " pod="openshift-marketplace/certified-operators-fr6vh" Oct 07 13:09:37 crc kubenswrapper[4677]: I1007 13:09:37.768545 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Oct 07 13:09:37 crc kubenswrapper[4677]: I1007 13:09:37.777353 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w9fhn\" (UniqueName: \"kubernetes.io/projected/f81da05e-fd1c-4a91-947d-f5d6958518d0-kube-api-access-w9fhn\") pod \"certified-operators-fr6vh\" (UID: \"f81da05e-fd1c-4a91-947d-f5d6958518d0\") " pod="openshift-marketplace/certified-operators-fr6vh" Oct 07 13:09:37 crc kubenswrapper[4677]: I1007 13:09:37.852696 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l4tq9\" (UniqueName: \"kubernetes.io/projected/480ea359-f37d-4365-89c5-8f30e79f7c79-kube-api-access-l4tq9\") pod \"community-operators-vmcx7\" (UID: \"480ea359-f37d-4365-89c5-8f30e79f7c79\") " pod="openshift-marketplace/community-operators-vmcx7" Oct 07 13:09:37 crc kubenswrapper[4677]: I1007 13:09:37.852765 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/480ea359-f37d-4365-89c5-8f30e79f7c79-catalog-content\") pod \"community-operators-vmcx7\" (UID: \"480ea359-f37d-4365-89c5-8f30e79f7c79\") " pod="openshift-marketplace/community-operators-vmcx7" Oct 07 13:09:37 crc kubenswrapper[4677]: I1007 13:09:37.852809 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/480ea359-f37d-4365-89c5-8f30e79f7c79-utilities\") pod \"community-operators-vmcx7\" (UID: \"480ea359-f37d-4365-89c5-8f30e79f7c79\") " pod="openshift-marketplace/community-operators-vmcx7" Oct 07 13:09:37 crc kubenswrapper[4677]: I1007 13:09:37.868225 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-27x4s"] Oct 07 13:09:37 crc kubenswrapper[4677]: I1007 13:09:37.888492 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-bldn4" Oct 07 13:09:37 crc kubenswrapper[4677]: I1007 13:09:37.904981 4677 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-fr6vh" Oct 07 13:09:37 crc kubenswrapper[4677]: W1007 13:09:37.941731 4677 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b6479f0_333b_4a96_9adf_2099afdc2447.slice/crio-0c9c2cdd1d199af6c1d0bb6e79b2b7530b47d13ed2603b9577a0faddd13cc1d4 WatchSource:0}: Error finding container 0c9c2cdd1d199af6c1d0bb6e79b2b7530b47d13ed2603b9577a0faddd13cc1d4: Status 404 returned error can't find the container with id 0c9c2cdd1d199af6c1d0bb6e79b2b7530b47d13ed2603b9577a0faddd13cc1d4 Oct 07 13:09:37 crc kubenswrapper[4677]: I1007 13:09:37.954040 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/480ea359-f37d-4365-89c5-8f30e79f7c79-utilities\") pod \"community-operators-vmcx7\" (UID: \"480ea359-f37d-4365-89c5-8f30e79f7c79\") " pod="openshift-marketplace/community-operators-vmcx7" Oct 07 13:09:37 crc kubenswrapper[4677]: I1007 13:09:37.954110 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l4tq9\" (UniqueName: \"kubernetes.io/projected/480ea359-f37d-4365-89c5-8f30e79f7c79-kube-api-access-l4tq9\") pod \"community-operators-vmcx7\" (UID: \"480ea359-f37d-4365-89c5-8f30e79f7c79\") " pod="openshift-marketplace/community-operators-vmcx7" Oct 07 13:09:37 crc kubenswrapper[4677]: I1007 13:09:37.954147 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/480ea359-f37d-4365-89c5-8f30e79f7c79-catalog-content\") pod \"community-operators-vmcx7\" (UID: \"480ea359-f37d-4365-89c5-8f30e79f7c79\") " pod="openshift-marketplace/community-operators-vmcx7" Oct 07 13:09:37 crc kubenswrapper[4677]: I1007 13:09:37.954539 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/480ea359-f37d-4365-89c5-8f30e79f7c79-catalog-content\") pod \"community-operators-vmcx7\" (UID: \"480ea359-f37d-4365-89c5-8f30e79f7c79\") " pod="openshift-marketplace/community-operators-vmcx7" Oct 07 13:09:37 crc kubenswrapper[4677]: I1007 13:09:37.954836 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/480ea359-f37d-4365-89c5-8f30e79f7c79-utilities\") pod \"community-operators-vmcx7\" (UID: \"480ea359-f37d-4365-89c5-8f30e79f7c79\") " pod="openshift-marketplace/community-operators-vmcx7" Oct 07 13:09:37 crc kubenswrapper[4677]: I1007 13:09:37.970696 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l4tq9\" (UniqueName: \"kubernetes.io/projected/480ea359-f37d-4365-89c5-8f30e79f7c79-kube-api-access-l4tq9\") pod \"community-operators-vmcx7\" (UID: \"480ea359-f37d-4365-89c5-8f30e79f7c79\") " pod="openshift-marketplace/community-operators-vmcx7" Oct 07 13:09:37 crc kubenswrapper[4677]: I1007 13:09:37.973243 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hlxmv"] Oct 07 13:09:37 crc kubenswrapper[4677]: W1007 13:09:37.977272 4677 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc9473eea_460d_4148_8b4f_f2e0ccba3b2e.slice/crio-96867bd0aea0617d04ab569896a67cae7c05aedb53327c99adfc7a7881b82872 WatchSource:0}: Error finding container 
96867bd0aea0617d04ab569896a67cae7c05aedb53327c99adfc7a7881b82872: Status 404 returned error can't find the container with id 96867bd0aea0617d04ab569896a67cae7c05aedb53327c99adfc7a7881b82872 Oct 07 13:09:38 crc kubenswrapper[4677]: I1007 13:09:38.045738 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vmcx7" Oct 07 13:09:38 crc kubenswrapper[4677]: I1007 13:09:38.128077 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-bldn4"] Oct 07 13:09:38 crc kubenswrapper[4677]: I1007 13:09:38.198116 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-fr6vh"] Oct 07 13:09:38 crc kubenswrapper[4677]: W1007 13:09:38.222594 4677 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf81da05e_fd1c_4a91_947d_f5d6958518d0.slice/crio-34d8b5e42fe6dd51f159d108517dc50f89c4e2bdcc1c785ca2c33f7ee4088826 WatchSource:0}: Error finding container 34d8b5e42fe6dd51f159d108517dc50f89c4e2bdcc1c785ca2c33f7ee4088826: Status 404 returned error can't find the container with id 34d8b5e42fe6dd51f159d108517dc50f89c4e2bdcc1c785ca2c33f7ee4088826 Oct 07 13:09:38 crc kubenswrapper[4677]: I1007 13:09:38.300761 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vmcx7"] Oct 07 13:09:38 crc kubenswrapper[4677]: W1007 13:09:38.308583 4677 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod480ea359_f37d_4365_89c5_8f30e79f7c79.slice/crio-45c25972dc11a541beb62dad3af849a37513ba7084345333ccb161dfa7f4e54f WatchSource:0}: Error finding container 45c25972dc11a541beb62dad3af849a37513ba7084345333ccb161dfa7f4e54f: Status 404 returned error can't find the container with id 45c25972dc11a541beb62dad3af849a37513ba7084345333ccb161dfa7f4e54f Oct 07 13:09:38 crc kubenswrapper[4677]: I1007 13:09:38.401131 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fr6vh" event={"ID":"f81da05e-fd1c-4a91-947d-f5d6958518d0","Type":"ContainerStarted","Data":"34d8b5e42fe6dd51f159d108517dc50f89c4e2bdcc1c785ca2c33f7ee4088826"} Oct 07 13:09:38 crc kubenswrapper[4677]: I1007 13:09:38.403203 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"61f5bf1336c2e1261b9d16171b6a1504334ac71361beedb5a68b2ee35ab7b3e1"} Oct 07 13:09:38 crc kubenswrapper[4677]: I1007 13:09:38.404969 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-bldn4" event={"ID":"51dd4275-14c4-459b-a065-46ae2b4fd741","Type":"ContainerStarted","Data":"3681657c57c95ca89d6ca84dcad897a7c45f44f2efe24da666bd9ff2b8f6f9b1"} Oct 07 13:09:38 crc kubenswrapper[4677]: I1007 13:09:38.407053 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vmcx7" event={"ID":"480ea359-f37d-4365-89c5-8f30e79f7c79","Type":"ContainerStarted","Data":"45c25972dc11a541beb62dad3af849a37513ba7084345333ccb161dfa7f4e54f"} Oct 07 13:09:38 crc kubenswrapper[4677]: I1007 13:09:38.410608 4677 generic.go:334] "Generic (PLEG): container finished" podID="b84820c2-fd0b-4e52-801c-a70286d639de" 
containerID="f538f6f357103da92c37795caa9efcf1541ba6dad60f92ce602a4c73ae66687d" exitCode=0 Oct 07 13:09:38 crc kubenswrapper[4677]: I1007 13:09:38.410717 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29330700-zvg2m" event={"ID":"b84820c2-fd0b-4e52-801c-a70286d639de","Type":"ContainerDied","Data":"f538f6f357103da92c37795caa9efcf1541ba6dad60f92ce602a4c73ae66687d"} Oct 07 13:09:38 crc kubenswrapper[4677]: I1007 13:09:38.418293 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hlxmv" event={"ID":"c9473eea-460d-4148-8b4f-f2e0ccba3b2e","Type":"ContainerStarted","Data":"96867bd0aea0617d04ab569896a67cae7c05aedb53327c99adfc7a7881b82872"} Oct 07 13:09:38 crc kubenswrapper[4677]: I1007 13:09:38.421592 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-27x4s" event={"ID":"57d9fc75-7df6-4205-9600-0e0d0ff04f8a","Type":"ContainerStarted","Data":"56ae3a9e16ed281aa2dc4653fe272ead892537929b5a8125d10ea02e5066638d"} Oct 07 13:09:38 crc kubenswrapper[4677]: I1007 13:09:38.425917 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"556260954d2d6e496c4a1aa65effa0da3dda3ac56db325225e7c006a106cb1f9"} Oct 07 13:09:38 crc kubenswrapper[4677]: I1007 13:09:38.428484 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"0c9c2cdd1d199af6c1d0bb6e79b2b7530b47d13ed2603b9577a0faddd13cc1d4"} Oct 07 13:09:38 crc kubenswrapper[4677]: I1007 13:09:38.429326 4677 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 13:09:38 crc kubenswrapper[4677]: I1007 13:09:38.687143 4677 patch_prober.go:28] interesting pod/router-default-5444994796-7pwvn container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 07 13:09:38 crc kubenswrapper[4677]: [-]has-synced failed: reason withheld Oct 07 13:09:38 crc kubenswrapper[4677]: [+]process-running ok Oct 07 13:09:38 crc kubenswrapper[4677]: healthz check failed Oct 07 13:09:38 crc kubenswrapper[4677]: I1007 13:09:38.687621 4677 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-7pwvn" podUID="6773777c-949c-46da-95b7-c6008e52b396" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 07 13:09:39 crc kubenswrapper[4677]: I1007 13:09:39.129999 4677 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-frwdt"] Oct 07 13:09:39 crc kubenswrapper[4677]: I1007 13:09:39.132007 4677 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-frwdt" Oct 07 13:09:39 crc kubenswrapper[4677]: I1007 13:09:39.134368 4677 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Oct 07 13:09:39 crc kubenswrapper[4677]: I1007 13:09:39.144888 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-frwdt"] Oct 07 13:09:39 crc kubenswrapper[4677]: I1007 13:09:39.271354 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8932abb2-d07f-45df-bd32-0ac930df1346-catalog-content\") pod \"redhat-marketplace-frwdt\" (UID: \"8932abb2-d07f-45df-bd32-0ac930df1346\") " pod="openshift-marketplace/redhat-marketplace-frwdt" Oct 07 13:09:39 crc kubenswrapper[4677]: I1007 13:09:39.271577 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6wn5b\" (UniqueName: \"kubernetes.io/projected/8932abb2-d07f-45df-bd32-0ac930df1346-kube-api-access-6wn5b\") pod \"redhat-marketplace-frwdt\" (UID: \"8932abb2-d07f-45df-bd32-0ac930df1346\") " pod="openshift-marketplace/redhat-marketplace-frwdt" Oct 07 13:09:39 crc kubenswrapper[4677]: I1007 13:09:39.271675 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8932abb2-d07f-45df-bd32-0ac930df1346-utilities\") pod \"redhat-marketplace-frwdt\" (UID: \"8932abb2-d07f-45df-bd32-0ac930df1346\") " pod="openshift-marketplace/redhat-marketplace-frwdt" Oct 07 13:09:39 crc kubenswrapper[4677]: I1007 13:09:39.313109 4677 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Oct 07 13:09:39 crc kubenswrapper[4677]: I1007 13:09:39.340814 4677 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Oct 07 13:09:39 crc kubenswrapper[4677]: I1007 13:09:39.341610 4677 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 07 13:09:39 crc kubenswrapper[4677]: I1007 13:09:39.343248 4677 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Oct 07 13:09:39 crc kubenswrapper[4677]: I1007 13:09:39.344851 4677 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Oct 07 13:09:39 crc kubenswrapper[4677]: I1007 13:09:39.350933 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Oct 07 13:09:39 crc kubenswrapper[4677]: I1007 13:09:39.375943 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8932abb2-d07f-45df-bd32-0ac930df1346-catalog-content\") pod \"redhat-marketplace-frwdt\" (UID: \"8932abb2-d07f-45df-bd32-0ac930df1346\") " pod="openshift-marketplace/redhat-marketplace-frwdt" Oct 07 13:09:39 crc kubenswrapper[4677]: I1007 13:09:39.376057 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6wn5b\" (UniqueName: \"kubernetes.io/projected/8932abb2-d07f-45df-bd32-0ac930df1346-kube-api-access-6wn5b\") pod \"redhat-marketplace-frwdt\" (UID: \"8932abb2-d07f-45df-bd32-0ac930df1346\") " pod="openshift-marketplace/redhat-marketplace-frwdt" Oct 07 13:09:39 crc kubenswrapper[4677]: I1007 13:09:39.376094 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8932abb2-d07f-45df-bd32-0ac930df1346-utilities\") pod \"redhat-marketplace-frwdt\" (UID: \"8932abb2-d07f-45df-bd32-0ac930df1346\") " pod="openshift-marketplace/redhat-marketplace-frwdt" Oct 07 13:09:39 crc kubenswrapper[4677]: I1007 13:09:39.376394 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8932abb2-d07f-45df-bd32-0ac930df1346-catalog-content\") pod \"redhat-marketplace-frwdt\" (UID: \"8932abb2-d07f-45df-bd32-0ac930df1346\") " pod="openshift-marketplace/redhat-marketplace-frwdt" Oct 07 13:09:39 crc kubenswrapper[4677]: I1007 13:09:39.377367 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8932abb2-d07f-45df-bd32-0ac930df1346-utilities\") pod \"redhat-marketplace-frwdt\" (UID: \"8932abb2-d07f-45df-bd32-0ac930df1346\") " pod="openshift-marketplace/redhat-marketplace-frwdt" Oct 07 13:09:39 crc kubenswrapper[4677]: I1007 13:09:39.397617 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6wn5b\" (UniqueName: \"kubernetes.io/projected/8932abb2-d07f-45df-bd32-0ac930df1346-kube-api-access-6wn5b\") pod \"redhat-marketplace-frwdt\" (UID: \"8932abb2-d07f-45df-bd32-0ac930df1346\") " pod="openshift-marketplace/redhat-marketplace-frwdt" Oct 07 13:09:39 crc kubenswrapper[4677]: I1007 13:09:39.435949 4677 generic.go:334] "Generic (PLEG): container finished" podID="57d9fc75-7df6-4205-9600-0e0d0ff04f8a" containerID="9d23d80be02384015c4165dd8531b5a703386c8e6a048345d1c600ede802ec23" exitCode=0 Oct 07 13:09:39 crc kubenswrapper[4677]: I1007 13:09:39.436008 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-27x4s" 
event={"ID":"57d9fc75-7df6-4205-9600-0e0d0ff04f8a","Type":"ContainerDied","Data":"9d23d80be02384015c4165dd8531b5a703386c8e6a048345d1c600ede802ec23"} Oct 07 13:09:39 crc kubenswrapper[4677]: I1007 13:09:39.437546 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"5ee0f4df3a3c3f33a61448f1a0f0c902d7192d8a264f107e717c1fb4ac2487f9"} Oct 07 13:09:39 crc kubenswrapper[4677]: I1007 13:09:39.439387 4677 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 07 13:09:39 crc kubenswrapper[4677]: I1007 13:09:39.443503 4677 generic.go:334] "Generic (PLEG): container finished" podID="480ea359-f37d-4365-89c5-8f30e79f7c79" containerID="d27ec6c67dbefada027f7031b82867e3c57eeccad0ea29e57519ebeb699002e2" exitCode=0 Oct 07 13:09:39 crc kubenswrapper[4677]: I1007 13:09:39.443585 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vmcx7" event={"ID":"480ea359-f37d-4365-89c5-8f30e79f7c79","Type":"ContainerDied","Data":"d27ec6c67dbefada027f7031b82867e3c57eeccad0ea29e57519ebeb699002e2"} Oct 07 13:09:39 crc kubenswrapper[4677]: I1007 13:09:39.445494 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"0fb1fed3cc99e8fbd1fdc7283686f77c517d8a59063bb8ccd1d3e2716a1a12c9"} Oct 07 13:09:39 crc kubenswrapper[4677]: I1007 13:09:39.448395 4677 generic.go:334] "Generic (PLEG): container finished" podID="f81da05e-fd1c-4a91-947d-f5d6958518d0" containerID="b2f491102438133fcdffececc4b6ee6d9fc684f316d5b0e1e4764705311c7c9a" exitCode=0 Oct 07 13:09:39 crc kubenswrapper[4677]: I1007 13:09:39.448470 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fr6vh" event={"ID":"f81da05e-fd1c-4a91-947d-f5d6958518d0","Type":"ContainerDied","Data":"b2f491102438133fcdffececc4b6ee6d9fc684f316d5b0e1e4764705311c7c9a"} Oct 07 13:09:39 crc kubenswrapper[4677]: I1007 13:09:39.461971 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"29ca86366d7e146fbb50febd78892ae360193b75a6669d9730e1019aeb8d0a38"} Oct 07 13:09:39 crc kubenswrapper[4677]: I1007 13:09:39.465997 4677 generic.go:334] "Generic (PLEG): container finished" podID="c9473eea-460d-4148-8b4f-f2e0ccba3b2e" containerID="293d0e71613e7cd3086a9b53010356ef3b1dd4d59c4bc031f7ee183df98874ac" exitCode=0 Oct 07 13:09:39 crc kubenswrapper[4677]: I1007 13:09:39.466047 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hlxmv" event={"ID":"c9473eea-460d-4148-8b4f-f2e0ccba3b2e","Type":"ContainerDied","Data":"293d0e71613e7cd3086a9b53010356ef3b1dd4d59c4bc031f7ee183df98874ac"} Oct 07 13:09:39 crc kubenswrapper[4677]: I1007 13:09:39.470086 4677 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-frwdt" Oct 07 13:09:39 crc kubenswrapper[4677]: I1007 13:09:39.470320 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-bldn4" event={"ID":"51dd4275-14c4-459b-a065-46ae2b4fd741","Type":"ContainerStarted","Data":"81d2ac496dadeb51e9fb436d0d472e486d28871dfdba9827e72fbcf63f87ca24"} Oct 07 13:09:39 crc kubenswrapper[4677]: I1007 13:09:39.470691 4677 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-bldn4" Oct 07 13:09:39 crc kubenswrapper[4677]: I1007 13:09:39.478718 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/fe90fc09-1c81-4cbe-b00c-6f6004f1dcf3-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"fe90fc09-1c81-4cbe-b00c-6f6004f1dcf3\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 07 13:09:39 crc kubenswrapper[4677]: I1007 13:09:39.478807 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fe90fc09-1c81-4cbe-b00c-6f6004f1dcf3-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"fe90fc09-1c81-4cbe-b00c-6f6004f1dcf3\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 07 13:09:39 crc kubenswrapper[4677]: I1007 13:09:39.539470 4677 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-84v59"] Oct 07 13:09:39 crc kubenswrapper[4677]: I1007 13:09:39.540733 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-84v59" Oct 07 13:09:39 crc kubenswrapper[4677]: I1007 13:09:39.542480 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-84v59"] Oct 07 13:09:39 crc kubenswrapper[4677]: I1007 13:09:39.549252 4677 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-bldn4" podStartSLOduration=130.549232394 podStartE2EDuration="2m10.549232394s" podCreationTimestamp="2025-10-07 13:07:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:09:39.548804308 +0000 UTC m=+151.034513413" watchObservedRunningTime="2025-10-07 13:09:39.549232394 +0000 UTC m=+151.034941509" Oct 07 13:09:39 crc kubenswrapper[4677]: I1007 13:09:39.571948 4677 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-bkhfv" Oct 07 13:09:39 crc kubenswrapper[4677]: I1007 13:09:39.583605 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/fe90fc09-1c81-4cbe-b00c-6f6004f1dcf3-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"fe90fc09-1c81-4cbe-b00c-6f6004f1dcf3\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 07 13:09:39 crc kubenswrapper[4677]: I1007 13:09:39.583683 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fe90fc09-1c81-4cbe-b00c-6f6004f1dcf3-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"fe90fc09-1c81-4cbe-b00c-6f6004f1dcf3\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 07 
13:09:39 crc kubenswrapper[4677]: I1007 13:09:39.585258 4677 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-bkhfv" Oct 07 13:09:39 crc kubenswrapper[4677]: I1007 13:09:39.585847 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/fe90fc09-1c81-4cbe-b00c-6f6004f1dcf3-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"fe90fc09-1c81-4cbe-b00c-6f6004f1dcf3\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 07 13:09:39 crc kubenswrapper[4677]: I1007 13:09:39.604460 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fe90fc09-1c81-4cbe-b00c-6f6004f1dcf3-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"fe90fc09-1c81-4cbe-b00c-6f6004f1dcf3\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 07 13:09:39 crc kubenswrapper[4677]: I1007 13:09:39.659194 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 07 13:09:39 crc kubenswrapper[4677]: I1007 13:09:39.684819 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2defae4c-9ad5-42b5-89c7-100b68d49d6d-catalog-content\") pod \"redhat-marketplace-84v59\" (UID: \"2defae4c-9ad5-42b5-89c7-100b68d49d6d\") " pod="openshift-marketplace/redhat-marketplace-84v59" Oct 07 13:09:39 crc kubenswrapper[4677]: I1007 13:09:39.684861 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5xmnr\" (UniqueName: \"kubernetes.io/projected/2defae4c-9ad5-42b5-89c7-100b68d49d6d-kube-api-access-5xmnr\") pod \"redhat-marketplace-84v59\" (UID: \"2defae4c-9ad5-42b5-89c7-100b68d49d6d\") " pod="openshift-marketplace/redhat-marketplace-84v59" Oct 07 13:09:39 crc kubenswrapper[4677]: I1007 13:09:39.684909 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2defae4c-9ad5-42b5-89c7-100b68d49d6d-utilities\") pod \"redhat-marketplace-84v59\" (UID: \"2defae4c-9ad5-42b5-89c7-100b68d49d6d\") " pod="openshift-marketplace/redhat-marketplace-84v59" Oct 07 13:09:39 crc kubenswrapper[4677]: I1007 13:09:39.692645 4677 patch_prober.go:28] interesting pod/router-default-5444994796-7pwvn container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 07 13:09:39 crc kubenswrapper[4677]: [-]has-synced failed: reason withheld Oct 07 13:09:39 crc kubenswrapper[4677]: [+]process-running ok Oct 07 13:09:39 crc kubenswrapper[4677]: healthz check failed Oct 07 13:09:39 crc kubenswrapper[4677]: I1007 13:09:39.692704 4677 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-7pwvn" podUID="6773777c-949c-46da-95b7-c6008e52b396" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 07 13:09:39 crc kubenswrapper[4677]: I1007 13:09:39.786596 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2defae4c-9ad5-42b5-89c7-100b68d49d6d-catalog-content\") pod \"redhat-marketplace-84v59\" (UID: 
\"2defae4c-9ad5-42b5-89c7-100b68d49d6d\") " pod="openshift-marketplace/redhat-marketplace-84v59" Oct 07 13:09:39 crc kubenswrapper[4677]: I1007 13:09:39.786685 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5xmnr\" (UniqueName: \"kubernetes.io/projected/2defae4c-9ad5-42b5-89c7-100b68d49d6d-kube-api-access-5xmnr\") pod \"redhat-marketplace-84v59\" (UID: \"2defae4c-9ad5-42b5-89c7-100b68d49d6d\") " pod="openshift-marketplace/redhat-marketplace-84v59" Oct 07 13:09:39 crc kubenswrapper[4677]: I1007 13:09:39.786718 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2defae4c-9ad5-42b5-89c7-100b68d49d6d-utilities\") pod \"redhat-marketplace-84v59\" (UID: \"2defae4c-9ad5-42b5-89c7-100b68d49d6d\") " pod="openshift-marketplace/redhat-marketplace-84v59" Oct 07 13:09:39 crc kubenswrapper[4677]: I1007 13:09:39.787102 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2defae4c-9ad5-42b5-89c7-100b68d49d6d-utilities\") pod \"redhat-marketplace-84v59\" (UID: \"2defae4c-9ad5-42b5-89c7-100b68d49d6d\") " pod="openshift-marketplace/redhat-marketplace-84v59" Oct 07 13:09:39 crc kubenswrapper[4677]: I1007 13:09:39.787308 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2defae4c-9ad5-42b5-89c7-100b68d49d6d-catalog-content\") pod \"redhat-marketplace-84v59\" (UID: \"2defae4c-9ad5-42b5-89c7-100b68d49d6d\") " pod="openshift-marketplace/redhat-marketplace-84v59" Oct 07 13:09:39 crc kubenswrapper[4677]: I1007 13:09:39.825172 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5xmnr\" (UniqueName: \"kubernetes.io/projected/2defae4c-9ad5-42b5-89c7-100b68d49d6d-kube-api-access-5xmnr\") pod \"redhat-marketplace-84v59\" (UID: \"2defae4c-9ad5-42b5-89c7-100b68d49d6d\") " pod="openshift-marketplace/redhat-marketplace-84v59" Oct 07 13:09:39 crc kubenswrapper[4677]: I1007 13:09:39.845344 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-frwdt"] Oct 07 13:09:39 crc kubenswrapper[4677]: W1007 13:09:39.856111 4677 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8932abb2_d07f_45df_bd32_0ac930df1346.slice/crio-b801dee5110267ee9a212216728b1d282f1b0f29ef826ca977476e21ad2a2d22 WatchSource:0}: Error finding container b801dee5110267ee9a212216728b1d282f1b0f29ef826ca977476e21ad2a2d22: Status 404 returned error can't find the container with id b801dee5110267ee9a212216728b1d282f1b0f29ef826ca977476e21ad2a2d22 Oct 07 13:09:39 crc kubenswrapper[4677]: I1007 13:09:39.861551 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-84v59" Oct 07 13:09:39 crc kubenswrapper[4677]: I1007 13:09:39.870643 4677 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330700-zvg2m" Oct 07 13:09:39 crc kubenswrapper[4677]: I1007 13:09:39.988680 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b84820c2-fd0b-4e52-801c-a70286d639de-secret-volume\") pod \"b84820c2-fd0b-4e52-801c-a70286d639de\" (UID: \"b84820c2-fd0b-4e52-801c-a70286d639de\") " Oct 07 13:09:39 crc kubenswrapper[4677]: I1007 13:09:39.988844 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b84820c2-fd0b-4e52-801c-a70286d639de-config-volume\") pod \"b84820c2-fd0b-4e52-801c-a70286d639de\" (UID: \"b84820c2-fd0b-4e52-801c-a70286d639de\") " Oct 07 13:09:39 crc kubenswrapper[4677]: I1007 13:09:39.988919 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jljbf\" (UniqueName: \"kubernetes.io/projected/b84820c2-fd0b-4e52-801c-a70286d639de-kube-api-access-jljbf\") pod \"b84820c2-fd0b-4e52-801c-a70286d639de\" (UID: \"b84820c2-fd0b-4e52-801c-a70286d639de\") " Oct 07 13:09:39 crc kubenswrapper[4677]: I1007 13:09:39.990171 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b84820c2-fd0b-4e52-801c-a70286d639de-config-volume" (OuterVolumeSpecName: "config-volume") pod "b84820c2-fd0b-4e52-801c-a70286d639de" (UID: "b84820c2-fd0b-4e52-801c-a70286d639de"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:09:39 crc kubenswrapper[4677]: I1007 13:09:39.990681 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Oct 07 13:09:39 crc kubenswrapper[4677]: I1007 13:09:39.995824 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b84820c2-fd0b-4e52-801c-a70286d639de-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "b84820c2-fd0b-4e52-801c-a70286d639de" (UID: "b84820c2-fd0b-4e52-801c-a70286d639de"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:09:39 crc kubenswrapper[4677]: I1007 13:09:39.996196 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b84820c2-fd0b-4e52-801c-a70286d639de-kube-api-access-jljbf" (OuterVolumeSpecName: "kube-api-access-jljbf") pod "b84820c2-fd0b-4e52-801c-a70286d639de" (UID: "b84820c2-fd0b-4e52-801c-a70286d639de"). InnerVolumeSpecName "kube-api-access-jljbf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:09:40 crc kubenswrapper[4677]: W1007 13:09:40.001194 4677 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podfe90fc09_1c81_4cbe_b00c_6f6004f1dcf3.slice/crio-a24a4640e12d2c51747281954429ab7456d967bb084ff35a76838b056f168fcf WatchSource:0}: Error finding container a24a4640e12d2c51747281954429ab7456d967bb084ff35a76838b056f168fcf: Status 404 returned error can't find the container with id a24a4640e12d2c51747281954429ab7456d967bb084ff35a76838b056f168fcf Oct 07 13:09:40 crc kubenswrapper[4677]: I1007 13:09:40.090325 4677 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jljbf\" (UniqueName: \"kubernetes.io/projected/b84820c2-fd0b-4e52-801c-a70286d639de-kube-api-access-jljbf\") on node \"crc\" DevicePath \"\"" Oct 07 13:09:40 crc kubenswrapper[4677]: I1007 13:09:40.090605 4677 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b84820c2-fd0b-4e52-801c-a70286d639de-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 07 13:09:40 crc kubenswrapper[4677]: I1007 13:09:40.090615 4677 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b84820c2-fd0b-4e52-801c-a70286d639de-config-volume\") on node \"crc\" DevicePath \"\"" Oct 07 13:09:40 crc kubenswrapper[4677]: I1007 13:09:40.334480 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-84v59"] Oct 07 13:09:40 crc kubenswrapper[4677]: I1007 13:09:40.481456 4677 generic.go:334] "Generic (PLEG): container finished" podID="8932abb2-d07f-45df-bd32-0ac930df1346" containerID="75fd2eed622829fa48ebf03f86dab5fa4d72558b6979bab16fa44a6df09a89da" exitCode=0 Oct 07 13:09:40 crc kubenswrapper[4677]: I1007 13:09:40.481620 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-frwdt" event={"ID":"8932abb2-d07f-45df-bd32-0ac930df1346","Type":"ContainerDied","Data":"75fd2eed622829fa48ebf03f86dab5fa4d72558b6979bab16fa44a6df09a89da"} Oct 07 13:09:40 crc kubenswrapper[4677]: I1007 13:09:40.481667 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-frwdt" event={"ID":"8932abb2-d07f-45df-bd32-0ac930df1346","Type":"ContainerStarted","Data":"b801dee5110267ee9a212216728b1d282f1b0f29ef826ca977476e21ad2a2d22"} Oct 07 13:09:40 crc kubenswrapper[4677]: I1007 13:09:40.486620 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-84v59" event={"ID":"2defae4c-9ad5-42b5-89c7-100b68d49d6d","Type":"ContainerStarted","Data":"928cfca758b48eaf9cfe70cbda95c7dfa3a149bb58c6997b293506e59952bd36"} Oct 07 13:09:40 crc kubenswrapper[4677]: I1007 13:09:40.488881 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"fe90fc09-1c81-4cbe-b00c-6f6004f1dcf3","Type":"ContainerStarted","Data":"746087c339286f28f4ea3d6fe266c442a2c181ac5c9c021fc7bc2dd3b4ce27a4"} Oct 07 13:09:40 crc kubenswrapper[4677]: I1007 13:09:40.488925 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"fe90fc09-1c81-4cbe-b00c-6f6004f1dcf3","Type":"ContainerStarted","Data":"a24a4640e12d2c51747281954429ab7456d967bb084ff35a76838b056f168fcf"} Oct 07 13:09:40 crc kubenswrapper[4677]: I1007 13:09:40.494684 4677 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330700-zvg2m" Oct 07 13:09:40 crc kubenswrapper[4677]: I1007 13:09:40.494817 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29330700-zvg2m" event={"ID":"b84820c2-fd0b-4e52-801c-a70286d639de","Type":"ContainerDied","Data":"a6325291ea6080c047f7f18967697b93b638f421760a479dd28ca2db501665c3"} Oct 07 13:09:40 crc kubenswrapper[4677]: I1007 13:09:40.494873 4677 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a6325291ea6080c047f7f18967697b93b638f421760a479dd28ca2db501665c3" Oct 07 13:09:40 crc kubenswrapper[4677]: I1007 13:09:40.525493 4677 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-5cjf7"] Oct 07 13:09:40 crc kubenswrapper[4677]: E1007 13:09:40.525781 4677 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b84820c2-fd0b-4e52-801c-a70286d639de" containerName="collect-profiles" Oct 07 13:09:40 crc kubenswrapper[4677]: I1007 13:09:40.525792 4677 state_mem.go:107] "Deleted CPUSet assignment" podUID="b84820c2-fd0b-4e52-801c-a70286d639de" containerName="collect-profiles" Oct 07 13:09:40 crc kubenswrapper[4677]: I1007 13:09:40.525913 4677 memory_manager.go:354] "RemoveStaleState removing state" podUID="b84820c2-fd0b-4e52-801c-a70286d639de" containerName="collect-profiles" Oct 07 13:09:40 crc kubenswrapper[4677]: I1007 13:09:40.526660 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5cjf7" Oct 07 13:09:40 crc kubenswrapper[4677]: I1007 13:09:40.530215 4677 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Oct 07 13:09:40 crc kubenswrapper[4677]: I1007 13:09:40.545565 4677 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=1.5455455439999999 podStartE2EDuration="1.545545544s" podCreationTimestamp="2025-10-07 13:09:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:09:40.520153082 +0000 UTC m=+152.005862197" watchObservedRunningTime="2025-10-07 13:09:40.545545544 +0000 UTC m=+152.031254669" Oct 07 13:09:40 crc kubenswrapper[4677]: I1007 13:09:40.550292 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5cjf7"] Oct 07 13:09:40 crc kubenswrapper[4677]: I1007 13:09:40.604893 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1ecf8a0f-0b5b-42c2-80c5-cb0a82421387-catalog-content\") pod \"redhat-operators-5cjf7\" (UID: \"1ecf8a0f-0b5b-42c2-80c5-cb0a82421387\") " pod="openshift-marketplace/redhat-operators-5cjf7" Oct 07 13:09:40 crc kubenswrapper[4677]: I1007 13:09:40.605003 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lwt44\" (UniqueName: \"kubernetes.io/projected/1ecf8a0f-0b5b-42c2-80c5-cb0a82421387-kube-api-access-lwt44\") pod \"redhat-operators-5cjf7\" (UID: \"1ecf8a0f-0b5b-42c2-80c5-cb0a82421387\") " pod="openshift-marketplace/redhat-operators-5cjf7" Oct 07 13:09:40 crc kubenswrapper[4677]: I1007 13:09:40.605137 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1ecf8a0f-0b5b-42c2-80c5-cb0a82421387-utilities\") pod \"redhat-operators-5cjf7\" (UID: \"1ecf8a0f-0b5b-42c2-80c5-cb0a82421387\") " pod="openshift-marketplace/redhat-operators-5cjf7" Oct 07 13:09:40 crc kubenswrapper[4677]: I1007 13:09:40.686531 4677 patch_prober.go:28] interesting pod/router-default-5444994796-7pwvn container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 07 13:09:40 crc kubenswrapper[4677]: [-]has-synced failed: reason withheld Oct 07 13:09:40 crc kubenswrapper[4677]: [+]process-running ok Oct 07 13:09:40 crc kubenswrapper[4677]: healthz check failed Oct 07 13:09:40 crc kubenswrapper[4677]: I1007 13:09:40.686615 4677 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-7pwvn" podUID="6773777c-949c-46da-95b7-c6008e52b396" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 07 13:09:40 crc kubenswrapper[4677]: I1007 13:09:40.707354 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1ecf8a0f-0b5b-42c2-80c5-cb0a82421387-utilities\") pod \"redhat-operators-5cjf7\" (UID: \"1ecf8a0f-0b5b-42c2-80c5-cb0a82421387\") " pod="openshift-marketplace/redhat-operators-5cjf7" Oct 07 13:09:40 crc kubenswrapper[4677]: I1007 13:09:40.707474 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1ecf8a0f-0b5b-42c2-80c5-cb0a82421387-catalog-content\") pod \"redhat-operators-5cjf7\" (UID: \"1ecf8a0f-0b5b-42c2-80c5-cb0a82421387\") " pod="openshift-marketplace/redhat-operators-5cjf7" Oct 07 13:09:40 crc kubenswrapper[4677]: I1007 13:09:40.707527 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lwt44\" (UniqueName: \"kubernetes.io/projected/1ecf8a0f-0b5b-42c2-80c5-cb0a82421387-kube-api-access-lwt44\") pod \"redhat-operators-5cjf7\" (UID: \"1ecf8a0f-0b5b-42c2-80c5-cb0a82421387\") " pod="openshift-marketplace/redhat-operators-5cjf7" Oct 07 13:09:40 crc kubenswrapper[4677]: I1007 13:09:40.707867 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1ecf8a0f-0b5b-42c2-80c5-cb0a82421387-utilities\") pod \"redhat-operators-5cjf7\" (UID: \"1ecf8a0f-0b5b-42c2-80c5-cb0a82421387\") " pod="openshift-marketplace/redhat-operators-5cjf7" Oct 07 13:09:40 crc kubenswrapper[4677]: I1007 13:09:40.707961 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1ecf8a0f-0b5b-42c2-80c5-cb0a82421387-catalog-content\") pod \"redhat-operators-5cjf7\" (UID: \"1ecf8a0f-0b5b-42c2-80c5-cb0a82421387\") " pod="openshift-marketplace/redhat-operators-5cjf7" Oct 07 13:09:40 crc kubenswrapper[4677]: I1007 13:09:40.745513 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lwt44\" (UniqueName: \"kubernetes.io/projected/1ecf8a0f-0b5b-42c2-80c5-cb0a82421387-kube-api-access-lwt44\") pod \"redhat-operators-5cjf7\" (UID: \"1ecf8a0f-0b5b-42c2-80c5-cb0a82421387\") " pod="openshift-marketplace/redhat-operators-5cjf7" Oct 07 13:09:40 crc kubenswrapper[4677]: I1007 13:09:40.857772 4677 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-5cjf7" Oct 07 13:09:40 crc kubenswrapper[4677]: I1007 13:09:40.915739 4677 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-dxpgl"] Oct 07 13:09:40 crc kubenswrapper[4677]: I1007 13:09:40.917909 4677 patch_prober.go:28] interesting pod/machine-config-daemon-r7cnz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 13:09:40 crc kubenswrapper[4677]: I1007 13:09:40.918031 4677 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-r7cnz" podUID="7879fa59-a7cb-4d29-ba3a-c91f43bfcba6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 13:09:40 crc kubenswrapper[4677]: I1007 13:09:40.918365 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-dxpgl" Oct 07 13:09:40 crc kubenswrapper[4677]: I1007 13:09:40.924825 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-dxpgl"] Oct 07 13:09:40 crc kubenswrapper[4677]: I1007 13:09:40.935998 4677 patch_prober.go:28] interesting pod/downloads-7954f5f757-mwgkm container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" start-of-body= Oct 07 13:09:40 crc kubenswrapper[4677]: I1007 13:09:40.936047 4677 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-mwgkm" podUID="67be2c46-f396-48d1-ba5e-d21f8362a4dc" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" Oct 07 13:09:40 crc kubenswrapper[4677]: I1007 13:09:40.936478 4677 patch_prober.go:28] interesting pod/downloads-7954f5f757-mwgkm container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" start-of-body= Oct 07 13:09:40 crc kubenswrapper[4677]: I1007 13:09:40.936516 4677 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-mwgkm" podUID="67be2c46-f396-48d1-ba5e-d21f8362a4dc" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" Oct 07 13:09:40 crc kubenswrapper[4677]: I1007 13:09:40.975658 4677 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-76928" Oct 07 13:09:40 crc kubenswrapper[4677]: I1007 13:09:40.975788 4677 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-76928" Oct 07 13:09:40 crc kubenswrapper[4677]: I1007 13:09:40.977208 4677 patch_prober.go:28] interesting pod/console-f9d7485db-76928 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.31:8443/health\": dial tcp 10.217.0.31:8443: connect: connection refused" start-of-body= Oct 07 13:09:40 crc kubenswrapper[4677]: I1007 13:09:40.977250 4677 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-console/console-f9d7485db-76928" podUID="364ed7ee-3c5a-4d7f-ba97-ddd52483de83" containerName="console" probeResult="failure" output="Get \"https://10.217.0.31:8443/health\": dial tcp 10.217.0.31:8443: connect: connection refused" Oct 07 13:09:41 crc kubenswrapper[4677]: I1007 13:09:41.012066 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bcd3f7d5-1836-457c-b328-2dc358fd288c-catalog-content\") pod \"redhat-operators-dxpgl\" (UID: \"bcd3f7d5-1836-457c-b328-2dc358fd288c\") " pod="openshift-marketplace/redhat-operators-dxpgl" Oct 07 13:09:41 crc kubenswrapper[4677]: I1007 13:09:41.012691 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l5dxn\" (UniqueName: \"kubernetes.io/projected/bcd3f7d5-1836-457c-b328-2dc358fd288c-kube-api-access-l5dxn\") pod \"redhat-operators-dxpgl\" (UID: \"bcd3f7d5-1836-457c-b328-2dc358fd288c\") " pod="openshift-marketplace/redhat-operators-dxpgl" Oct 07 13:09:41 crc kubenswrapper[4677]: I1007 13:09:41.012741 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bcd3f7d5-1836-457c-b328-2dc358fd288c-utilities\") pod \"redhat-operators-dxpgl\" (UID: \"bcd3f7d5-1836-457c-b328-2dc358fd288c\") " pod="openshift-marketplace/redhat-operators-dxpgl" Oct 07 13:09:41 crc kubenswrapper[4677]: I1007 13:09:41.114828 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bcd3f7d5-1836-457c-b328-2dc358fd288c-utilities\") pod \"redhat-operators-dxpgl\" (UID: \"bcd3f7d5-1836-457c-b328-2dc358fd288c\") " pod="openshift-marketplace/redhat-operators-dxpgl" Oct 07 13:09:41 crc kubenswrapper[4677]: I1007 13:09:41.114991 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bcd3f7d5-1836-457c-b328-2dc358fd288c-catalog-content\") pod \"redhat-operators-dxpgl\" (UID: \"bcd3f7d5-1836-457c-b328-2dc358fd288c\") " pod="openshift-marketplace/redhat-operators-dxpgl" Oct 07 13:09:41 crc kubenswrapper[4677]: I1007 13:09:41.115022 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l5dxn\" (UniqueName: \"kubernetes.io/projected/bcd3f7d5-1836-457c-b328-2dc358fd288c-kube-api-access-l5dxn\") pod \"redhat-operators-dxpgl\" (UID: \"bcd3f7d5-1836-457c-b328-2dc358fd288c\") " pod="openshift-marketplace/redhat-operators-dxpgl" Oct 07 13:09:41 crc kubenswrapper[4677]: I1007 13:09:41.115479 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bcd3f7d5-1836-457c-b328-2dc358fd288c-utilities\") pod \"redhat-operators-dxpgl\" (UID: \"bcd3f7d5-1836-457c-b328-2dc358fd288c\") " pod="openshift-marketplace/redhat-operators-dxpgl" Oct 07 13:09:41 crc kubenswrapper[4677]: I1007 13:09:41.115751 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bcd3f7d5-1836-457c-b328-2dc358fd288c-catalog-content\") pod \"redhat-operators-dxpgl\" (UID: \"bcd3f7d5-1836-457c-b328-2dc358fd288c\") " pod="openshift-marketplace/redhat-operators-dxpgl" Oct 07 13:09:41 crc kubenswrapper[4677]: I1007 13:09:41.134884 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-l5dxn\" (UniqueName: \"kubernetes.io/projected/bcd3f7d5-1836-457c-b328-2dc358fd288c-kube-api-access-l5dxn\") pod \"redhat-operators-dxpgl\" (UID: \"bcd3f7d5-1836-457c-b328-2dc358fd288c\") " pod="openshift-marketplace/redhat-operators-dxpgl" Oct 07 13:09:41 crc kubenswrapper[4677]: I1007 13:09:41.182499 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5cjf7"] Oct 07 13:09:41 crc kubenswrapper[4677]: W1007 13:09:41.190250 4677 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1ecf8a0f_0b5b_42c2_80c5_cb0a82421387.slice/crio-59cff51fb4fbc9e7a8234bf09b605a6e134542e1dd127ec71ea988ad652d1784 WatchSource:0}: Error finding container 59cff51fb4fbc9e7a8234bf09b605a6e134542e1dd127ec71ea988ad652d1784: Status 404 returned error can't find the container with id 59cff51fb4fbc9e7a8234bf09b605a6e134542e1dd127ec71ea988ad652d1784 Oct 07 13:09:41 crc kubenswrapper[4677]: I1007 13:09:41.245412 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-dxpgl" Oct 07 13:09:41 crc kubenswrapper[4677]: I1007 13:09:41.515322 4677 generic.go:334] "Generic (PLEG): container finished" podID="1ecf8a0f-0b5b-42c2-80c5-cb0a82421387" containerID="49bb7520b791d734aa657d3da31e6fc7b72e2df992b8a171908a3152eecb1be2" exitCode=0 Oct 07 13:09:41 crc kubenswrapper[4677]: I1007 13:09:41.515416 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5cjf7" event={"ID":"1ecf8a0f-0b5b-42c2-80c5-cb0a82421387","Type":"ContainerDied","Data":"49bb7520b791d734aa657d3da31e6fc7b72e2df992b8a171908a3152eecb1be2"} Oct 07 13:09:41 crc kubenswrapper[4677]: I1007 13:09:41.515456 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5cjf7" event={"ID":"1ecf8a0f-0b5b-42c2-80c5-cb0a82421387","Type":"ContainerStarted","Data":"59cff51fb4fbc9e7a8234bf09b605a6e134542e1dd127ec71ea988ad652d1784"} Oct 07 13:09:41 crc kubenswrapper[4677]: I1007 13:09:41.526279 4677 generic.go:334] "Generic (PLEG): container finished" podID="fe90fc09-1c81-4cbe-b00c-6f6004f1dcf3" containerID="746087c339286f28f4ea3d6fe266c442a2c181ac5c9c021fc7bc2dd3b4ce27a4" exitCode=0 Oct 07 13:09:41 crc kubenswrapper[4677]: I1007 13:09:41.526410 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"fe90fc09-1c81-4cbe-b00c-6f6004f1dcf3","Type":"ContainerDied","Data":"746087c339286f28f4ea3d6fe266c442a2c181ac5c9c021fc7bc2dd3b4ce27a4"} Oct 07 13:09:41 crc kubenswrapper[4677]: I1007 13:09:41.536044 4677 generic.go:334] "Generic (PLEG): container finished" podID="2defae4c-9ad5-42b5-89c7-100b68d49d6d" containerID="2bba5ebcefbf6ba53a09e3d17645f9dd078b72699b67ac55d1904155d54d31ea" exitCode=0 Oct 07 13:09:41 crc kubenswrapper[4677]: I1007 13:09:41.536085 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-84v59" event={"ID":"2defae4c-9ad5-42b5-89c7-100b68d49d6d","Type":"ContainerDied","Data":"2bba5ebcefbf6ba53a09e3d17645f9dd078b72699b67ac55d1904155d54d31ea"} Oct 07 13:09:41 crc kubenswrapper[4677]: I1007 13:09:41.566620 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-dxpgl"] Oct 07 13:09:41 crc kubenswrapper[4677]: W1007 13:09:41.591316 4677 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbcd3f7d5_1836_457c_b328_2dc358fd288c.slice/crio-f7d4c7dbb6091f73a4e7b0819c847f407db4c9f2aed797bb417935d263204dd0 WatchSource:0}: Error finding container f7d4c7dbb6091f73a4e7b0819c847f407db4c9f2aed797bb417935d263204dd0: Status 404 returned error can't find the container with id f7d4c7dbb6091f73a4e7b0819c847f407db4c9f2aed797bb417935d263204dd0 Oct 07 13:09:41 crc kubenswrapper[4677]: I1007 13:09:41.682335 4677 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-7pwvn" Oct 07 13:09:41 crc kubenswrapper[4677]: I1007 13:09:41.685665 4677 patch_prober.go:28] interesting pod/router-default-5444994796-7pwvn container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 07 13:09:41 crc kubenswrapper[4677]: [-]has-synced failed: reason withheld Oct 07 13:09:41 crc kubenswrapper[4677]: [+]process-running ok Oct 07 13:09:41 crc kubenswrapper[4677]: healthz check failed Oct 07 13:09:41 crc kubenswrapper[4677]: I1007 13:09:41.685789 4677 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-7pwvn" podUID="6773777c-949c-46da-95b7-c6008e52b396" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 07 13:09:42 crc kubenswrapper[4677]: I1007 13:09:42.502699 4677 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Oct 07 13:09:42 crc kubenswrapper[4677]: I1007 13:09:42.503383 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 07 13:09:42 crc kubenswrapper[4677]: I1007 13:09:42.510859 4677 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Oct 07 13:09:42 crc kubenswrapper[4677]: I1007 13:09:42.511116 4677 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Oct 07 13:09:42 crc kubenswrapper[4677]: I1007 13:09:42.512231 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Oct 07 13:09:42 crc kubenswrapper[4677]: I1007 13:09:42.614639 4677 generic.go:334] "Generic (PLEG): container finished" podID="bcd3f7d5-1836-457c-b328-2dc358fd288c" containerID="da353dea73aa6c9e7385e9c29dd54d65239f9cbf4d2ccb3659438754204ac1ee" exitCode=0 Oct 07 13:09:42 crc kubenswrapper[4677]: I1007 13:09:42.615348 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dxpgl" event={"ID":"bcd3f7d5-1836-457c-b328-2dc358fd288c","Type":"ContainerDied","Data":"da353dea73aa6c9e7385e9c29dd54d65239f9cbf4d2ccb3659438754204ac1ee"} Oct 07 13:09:42 crc kubenswrapper[4677]: I1007 13:09:42.615370 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dxpgl" event={"ID":"bcd3f7d5-1836-457c-b328-2dc358fd288c","Type":"ContainerStarted","Data":"f7d4c7dbb6091f73a4e7b0819c847f407db4c9f2aed797bb417935d263204dd0"} Oct 07 13:09:42 crc kubenswrapper[4677]: I1007 13:09:42.660090 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8b269f98-8c39-4496-b4ab-84147ace5e15-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: 
\"8b269f98-8c39-4496-b4ab-84147ace5e15\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 07 13:09:42 crc kubenswrapper[4677]: I1007 13:09:42.660339 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8b269f98-8c39-4496-b4ab-84147ace5e15-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"8b269f98-8c39-4496-b4ab-84147ace5e15\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 07 13:09:42 crc kubenswrapper[4677]: I1007 13:09:42.685914 4677 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-7pwvn" Oct 07 13:09:42 crc kubenswrapper[4677]: I1007 13:09:42.690097 4677 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-7pwvn" Oct 07 13:09:42 crc kubenswrapper[4677]: I1007 13:09:42.762072 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8b269f98-8c39-4496-b4ab-84147ace5e15-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"8b269f98-8c39-4496-b4ab-84147ace5e15\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 07 13:09:42 crc kubenswrapper[4677]: I1007 13:09:42.762161 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8b269f98-8c39-4496-b4ab-84147ace5e15-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"8b269f98-8c39-4496-b4ab-84147ace5e15\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 07 13:09:42 crc kubenswrapper[4677]: I1007 13:09:42.762694 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8b269f98-8c39-4496-b4ab-84147ace5e15-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"8b269f98-8c39-4496-b4ab-84147ace5e15\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 07 13:09:42 crc kubenswrapper[4677]: I1007 13:09:42.781875 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8b269f98-8c39-4496-b4ab-84147ace5e15-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"8b269f98-8c39-4496-b4ab-84147ace5e15\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 07 13:09:42 crc kubenswrapper[4677]: I1007 13:09:42.859104 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 07 13:09:42 crc kubenswrapper[4677]: I1007 13:09:42.970227 4677 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 07 13:09:43 crc kubenswrapper[4677]: I1007 13:09:43.069692 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/fe90fc09-1c81-4cbe-b00c-6f6004f1dcf3-kubelet-dir\") pod \"fe90fc09-1c81-4cbe-b00c-6f6004f1dcf3\" (UID: \"fe90fc09-1c81-4cbe-b00c-6f6004f1dcf3\") " Oct 07 13:09:43 crc kubenswrapper[4677]: I1007 13:09:43.069921 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fe90fc09-1c81-4cbe-b00c-6f6004f1dcf3-kube-api-access\") pod \"fe90fc09-1c81-4cbe-b00c-6f6004f1dcf3\" (UID: \"fe90fc09-1c81-4cbe-b00c-6f6004f1dcf3\") " Oct 07 13:09:43 crc kubenswrapper[4677]: I1007 13:09:43.070818 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fe90fc09-1c81-4cbe-b00c-6f6004f1dcf3-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "fe90fc09-1c81-4cbe-b00c-6f6004f1dcf3" (UID: "fe90fc09-1c81-4cbe-b00c-6f6004f1dcf3"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 13:09:43 crc kubenswrapper[4677]: I1007 13:09:43.090799 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe90fc09-1c81-4cbe-b00c-6f6004f1dcf3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "fe90fc09-1c81-4cbe-b00c-6f6004f1dcf3" (UID: "fe90fc09-1c81-4cbe-b00c-6f6004f1dcf3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:09:43 crc kubenswrapper[4677]: I1007 13:09:43.174167 4677 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fe90fc09-1c81-4cbe-b00c-6f6004f1dcf3-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 07 13:09:43 crc kubenswrapper[4677]: I1007 13:09:43.174207 4677 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/fe90fc09-1c81-4cbe-b00c-6f6004f1dcf3-kubelet-dir\") on node \"crc\" DevicePath \"\"" Oct 07 13:09:43 crc kubenswrapper[4677]: I1007 13:09:43.200732 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Oct 07 13:09:43 crc kubenswrapper[4677]: I1007 13:09:43.494553 4677 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-bt865" Oct 07 13:09:43 crc kubenswrapper[4677]: I1007 13:09:43.638056 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"fe90fc09-1c81-4cbe-b00c-6f6004f1dcf3","Type":"ContainerDied","Data":"a24a4640e12d2c51747281954429ab7456d967bb084ff35a76838b056f168fcf"} Oct 07 13:09:43 crc kubenswrapper[4677]: I1007 13:09:43.638100 4677 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a24a4640e12d2c51747281954429ab7456d967bb084ff35a76838b056f168fcf" Oct 07 13:09:43 crc kubenswrapper[4677]: I1007 13:09:43.638161 4677 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 07 13:09:43 crc kubenswrapper[4677]: I1007 13:09:43.645481 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"8b269f98-8c39-4496-b4ab-84147ace5e15","Type":"ContainerStarted","Data":"fdd6e08a4a96d93cea617cf3f114a41d8ed38580806cea2752830f2d964d3ef4"} Oct 07 13:09:44 crc kubenswrapper[4677]: I1007 13:09:44.664736 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"8b269f98-8c39-4496-b4ab-84147ace5e15","Type":"ContainerStarted","Data":"1faf82974827530eae11c8018ba4b5f8cbf7ab43048b7713885dd40b71796ac7"} Oct 07 13:09:45 crc kubenswrapper[4677]: I1007 13:09:45.690731 4677 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=3.690716406 podStartE2EDuration="3.690716406s" podCreationTimestamp="2025-10-07 13:09:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:09:45.689857864 +0000 UTC m=+157.175566979" watchObservedRunningTime="2025-10-07 13:09:45.690716406 +0000 UTC m=+157.176425511" Oct 07 13:09:46 crc kubenswrapper[4677]: I1007 13:09:46.677783 4677 generic.go:334] "Generic (PLEG): container finished" podID="8b269f98-8c39-4496-b4ab-84147ace5e15" containerID="1faf82974827530eae11c8018ba4b5f8cbf7ab43048b7713885dd40b71796ac7" exitCode=0 Oct 07 13:09:46 crc kubenswrapper[4677]: I1007 13:09:46.677862 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"8b269f98-8c39-4496-b4ab-84147ace5e15","Type":"ContainerDied","Data":"1faf82974827530eae11c8018ba4b5f8cbf7ab43048b7713885dd40b71796ac7"} Oct 07 13:09:50 crc kubenswrapper[4677]: I1007 13:09:50.928968 4677 patch_prober.go:28] interesting pod/downloads-7954f5f757-mwgkm container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" start-of-body= Oct 07 13:09:50 crc kubenswrapper[4677]: I1007 13:09:50.929592 4677 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-mwgkm" podUID="67be2c46-f396-48d1-ba5e-d21f8362a4dc" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" Oct 07 13:09:50 crc kubenswrapper[4677]: I1007 13:09:50.929001 4677 patch_prober.go:28] interesting pod/downloads-7954f5f757-mwgkm container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" start-of-body= Oct 07 13:09:50 crc kubenswrapper[4677]: I1007 13:09:50.929667 4677 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-mwgkm" podUID="67be2c46-f396-48d1-ba5e-d21f8362a4dc" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" Oct 07 13:09:51 crc kubenswrapper[4677]: I1007 13:09:51.000685 4677 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-76928" Oct 07 13:09:51 crc kubenswrapper[4677]: I1007 13:09:51.005556 4677 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-76928" Oct 07 13:09:51 crc kubenswrapper[4677]: I1007 13:09:51.189660 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f63a77a6-7e4a-4ed0-a996-b8f80233d10c-metrics-certs\") pod \"network-metrics-daemon-8bljr\" (UID: \"f63a77a6-7e4a-4ed0-a996-b8f80233d10c\") " pod="openshift-multus/network-metrics-daemon-8bljr" Oct 07 13:09:51 crc kubenswrapper[4677]: I1007 13:09:51.195571 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f63a77a6-7e4a-4ed0-a996-b8f80233d10c-metrics-certs\") pod \"network-metrics-daemon-8bljr\" (UID: \"f63a77a6-7e4a-4ed0-a996-b8f80233d10c\") " pod="openshift-multus/network-metrics-daemon-8bljr" Oct 07 13:09:51 crc kubenswrapper[4677]: I1007 13:09:51.224053 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8bljr" Oct 07 13:09:54 crc kubenswrapper[4677]: I1007 13:09:54.685506 4677 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 07 13:09:54 crc kubenswrapper[4677]: I1007 13:09:54.739509 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8b269f98-8c39-4496-b4ab-84147ace5e15-kube-api-access\") pod \"8b269f98-8c39-4496-b4ab-84147ace5e15\" (UID: \"8b269f98-8c39-4496-b4ab-84147ace5e15\") " Oct 07 13:09:54 crc kubenswrapper[4677]: I1007 13:09:54.739577 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8b269f98-8c39-4496-b4ab-84147ace5e15-kubelet-dir\") pod \"8b269f98-8c39-4496-b4ab-84147ace5e15\" (UID: \"8b269f98-8c39-4496-b4ab-84147ace5e15\") " Oct 07 13:09:54 crc kubenswrapper[4677]: I1007 13:09:54.740093 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8b269f98-8c39-4496-b4ab-84147ace5e15-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "8b269f98-8c39-4496-b4ab-84147ace5e15" (UID: "8b269f98-8c39-4496-b4ab-84147ace5e15"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 13:09:54 crc kubenswrapper[4677]: I1007 13:09:54.746227 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b269f98-8c39-4496-b4ab-84147ace5e15-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "8b269f98-8c39-4496-b4ab-84147ace5e15" (UID: "8b269f98-8c39-4496-b4ab-84147ace5e15"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:09:54 crc kubenswrapper[4677]: I1007 13:09:54.749348 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"8b269f98-8c39-4496-b4ab-84147ace5e15","Type":"ContainerDied","Data":"fdd6e08a4a96d93cea617cf3f114a41d8ed38580806cea2752830f2d964d3ef4"} Oct 07 13:09:54 crc kubenswrapper[4677]: I1007 13:09:54.749398 4677 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fdd6e08a4a96d93cea617cf3f114a41d8ed38580806cea2752830f2d964d3ef4" Oct 07 13:09:54 crc kubenswrapper[4677]: I1007 13:09:54.749450 4677 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 07 13:09:54 crc kubenswrapper[4677]: I1007 13:09:54.841548 4677 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8b269f98-8c39-4496-b4ab-84147ace5e15-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 07 13:09:54 crc kubenswrapper[4677]: I1007 13:09:54.841589 4677 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8b269f98-8c39-4496-b4ab-84147ace5e15-kubelet-dir\") on node \"crc\" DevicePath \"\"" Oct 07 13:09:57 crc kubenswrapper[4677]: I1007 13:09:57.899509 4677 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-bldn4" Oct 07 13:10:00 crc kubenswrapper[4677]: I1007 13:10:00.934369 4677 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-mwgkm" Oct 07 13:10:10 crc kubenswrapper[4677]: I1007 13:10:10.917515 4677 patch_prober.go:28] interesting pod/machine-config-daemon-r7cnz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 13:10:10 crc kubenswrapper[4677]: I1007 13:10:10.917978 4677 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-r7cnz" podUID="7879fa59-a7cb-4d29-ba3a-c91f43bfcba6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 13:10:11 crc kubenswrapper[4677]: I1007 13:10:11.736592 4677 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-fd92z" Oct 07 13:10:17 crc kubenswrapper[4677]: E1007 13:10:17.172229 4677 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Oct 07 13:10:17 crc kubenswrapper[4677]: E1007 13:10:17.173133 4677 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-h5qxl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-27x4s_openshift-marketplace(57d9fc75-7df6-4205-9600-0e0d0ff04f8a): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 07 13:10:17 crc kubenswrapper[4677]: E1007 13:10:17.174703 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-27x4s" podUID="57d9fc75-7df6-4205-9600-0e0d0ff04f8a" Oct 07 13:10:17 crc kubenswrapper[4677]: I1007 13:10:17.553316 4677 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 13:10:18 crc kubenswrapper[4677]: E1007 13:10:18.761036 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-27x4s" podUID="57d9fc75-7df6-4205-9600-0e0d0ff04f8a" Oct 07 13:10:19 crc kubenswrapper[4677]: E1007 13:10:19.373526 4677 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Oct 07 13:10:19 crc kubenswrapper[4677]: E1007 13:10:19.373697 4677 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-l4tq9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-vmcx7_openshift-marketplace(480ea359-f37d-4365-89c5-8f30e79f7c79): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 07 13:10:19 crc kubenswrapper[4677]: E1007 13:10:19.374930 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-vmcx7" podUID="480ea359-f37d-4365-89c5-8f30e79f7c79" Oct 07 13:10:19 crc kubenswrapper[4677]: E1007 13:10:19.503407 4677 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Oct 07 13:10:19 crc kubenswrapper[4677]: E1007 13:10:19.503559 4677 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-w9fhn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-fr6vh_openshift-marketplace(f81da05e-fd1c-4a91-947d-f5d6958518d0): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 07 13:10:19 crc kubenswrapper[4677]: E1007 13:10:19.504655 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-fr6vh" podUID="f81da05e-fd1c-4a91-947d-f5d6958518d0" Oct 07 13:10:21 crc kubenswrapper[4677]: E1007 13:10:21.746722 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-vmcx7" podUID="480ea359-f37d-4365-89c5-8f30e79f7c79" Oct 07 13:10:21 crc kubenswrapper[4677]: E1007 13:10:21.746723 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-fr6vh" podUID="f81da05e-fd1c-4a91-947d-f5d6958518d0" Oct 07 13:10:21 crc kubenswrapper[4677]: E1007 13:10:21.836072 4677 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Oct 07 13:10:21 crc kubenswrapper[4677]: E1007 13:10:21.836259 4677 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-l5dxn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-dxpgl_openshift-marketplace(bcd3f7d5-1836-457c-b328-2dc358fd288c): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 07 13:10:21 crc kubenswrapper[4677]: E1007 13:10:21.837551 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-dxpgl" podUID="bcd3f7d5-1836-457c-b328-2dc358fd288c" Oct 07 13:10:21 crc kubenswrapper[4677]: E1007 13:10:21.841986 4677 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Oct 07 13:10:21 crc kubenswrapper[4677]: E1007 13:10:21.842098 4677 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-lwt44,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-5cjf7_openshift-marketplace(1ecf8a0f-0b5b-42c2-80c5-cb0a82421387): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 07 13:10:21 crc kubenswrapper[4677]: E1007 13:10:21.843231 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-5cjf7" podUID="1ecf8a0f-0b5b-42c2-80c5-cb0a82421387" Oct 07 13:10:24 crc kubenswrapper[4677]: E1007 13:10:24.545904 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-dxpgl" podUID="bcd3f7d5-1836-457c-b328-2dc358fd288c" Oct 07 13:10:24 crc kubenswrapper[4677]: E1007 13:10:24.545914 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-5cjf7" podUID="1ecf8a0f-0b5b-42c2-80c5-cb0a82421387" Oct 07 13:10:24 crc kubenswrapper[4677]: E1007 13:10:24.607913 4677 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Oct 07 13:10:24 crc kubenswrapper[4677]: E1007 13:10:24.608073 4677 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xs5rg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-hlxmv_openshift-marketplace(c9473eea-460d-4148-8b4f-f2e0ccba3b2e): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 07 13:10:24 crc kubenswrapper[4677]: E1007 13:10:24.609508 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-hlxmv" podUID="c9473eea-460d-4148-8b4f-f2e0ccba3b2e" Oct 07 13:10:24 crc kubenswrapper[4677]: I1007 13:10:24.910373 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-8bljr"] Oct 07 13:10:25 crc kubenswrapper[4677]: E1007 13:10:25.268331 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-hlxmv" podUID="c9473eea-460d-4148-8b4f-f2e0ccba3b2e" Oct 07 13:10:25 crc kubenswrapper[4677]: W1007 13:10:25.272185 4677 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf63a77a6_7e4a_4ed0_a996_b8f80233d10c.slice/crio-7d5ecdcea28e32cd3a61bd84ee338af8c32905d2984de253cfe00c5c222d04f4 WatchSource:0}: Error finding container 7d5ecdcea28e32cd3a61bd84ee338af8c32905d2984de253cfe00c5c222d04f4: Status 404 returned error can't find the container with id 7d5ecdcea28e32cd3a61bd84ee338af8c32905d2984de253cfe00c5c222d04f4 Oct 07 13:10:25 crc kubenswrapper[4677]: E1007 13:10:25.336041 4677 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Oct 07 13:10:25 crc kubenswrapper[4677]: E1007 13:10:25.336195 4677 kuberuntime_manager.go:1274] "Unhandled Error" err="init container 
&Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5xmnr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-84v59_openshift-marketplace(2defae4c-9ad5-42b5-89c7-100b68d49d6d): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 07 13:10:25 crc kubenswrapper[4677]: E1007 13:10:25.337604 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-84v59" podUID="2defae4c-9ad5-42b5-89c7-100b68d49d6d" Oct 07 13:10:25 crc kubenswrapper[4677]: E1007 13:10:25.348157 4677 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Oct 07 13:10:25 crc kubenswrapper[4677]: E1007 13:10:25.348391 4677 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6wn5b,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-frwdt_openshift-marketplace(8932abb2-d07f-45df-bd32-0ac930df1346): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 07 13:10:25 crc kubenswrapper[4677]: E1007 13:10:25.349624 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-frwdt" podUID="8932abb2-d07f-45df-bd32-0ac930df1346" Oct 07 13:10:25 crc kubenswrapper[4677]: I1007 13:10:25.928155 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-8bljr" event={"ID":"f63a77a6-7e4a-4ed0-a996-b8f80233d10c","Type":"ContainerStarted","Data":"637f0bf0982ff112358ba93f12ed9cc7b4081e54ac99a473210525e551859035"} Oct 07 13:10:25 crc kubenswrapper[4677]: I1007 13:10:25.929571 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-8bljr" event={"ID":"f63a77a6-7e4a-4ed0-a996-b8f80233d10c","Type":"ContainerStarted","Data":"45c47b1b67124ceb8e7e62d485b9a4f9ba670e9845bf7e0a66428241965bdf23"} Oct 07 13:10:25 crc kubenswrapper[4677]: I1007 13:10:25.929703 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-8bljr" event={"ID":"f63a77a6-7e4a-4ed0-a996-b8f80233d10c","Type":"ContainerStarted","Data":"7d5ecdcea28e32cd3a61bd84ee338af8c32905d2984de253cfe00c5c222d04f4"} Oct 07 13:10:25 crc kubenswrapper[4677]: E1007 13:10:25.930982 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-frwdt" podUID="8932abb2-d07f-45df-bd32-0ac930df1346" Oct 07 13:10:25 crc kubenswrapper[4677]: E1007 13:10:25.932136 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image 
\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-84v59" podUID="2defae4c-9ad5-42b5-89c7-100b68d49d6d" Oct 07 13:10:25 crc kubenswrapper[4677]: I1007 13:10:25.960500 4677 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-8bljr" podStartSLOduration=177.960479821 podStartE2EDuration="2m57.960479821s" podCreationTimestamp="2025-10-07 13:07:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:10:25.958470203 +0000 UTC m=+197.444179328" watchObservedRunningTime="2025-10-07 13:10:25.960479821 +0000 UTC m=+197.446188966" Oct 07 13:10:31 crc kubenswrapper[4677]: I1007 13:10:31.976331 4677 generic.go:334] "Generic (PLEG): container finished" podID="57d9fc75-7df6-4205-9600-0e0d0ff04f8a" containerID="37a6f1bef96a33c09a37a5861bd31c7b32f9d9c7832b94773f92af63867fdbb4" exitCode=0 Oct 07 13:10:31 crc kubenswrapper[4677]: I1007 13:10:31.976406 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-27x4s" event={"ID":"57d9fc75-7df6-4205-9600-0e0d0ff04f8a","Type":"ContainerDied","Data":"37a6f1bef96a33c09a37a5861bd31c7b32f9d9c7832b94773f92af63867fdbb4"} Oct 07 13:10:35 crc kubenswrapper[4677]: I1007 13:10:35.033022 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-27x4s" event={"ID":"57d9fc75-7df6-4205-9600-0e0d0ff04f8a","Type":"ContainerStarted","Data":"07b90718fa1794fc4a0ca8352d664a3f198762e762f6dc658ab972880a1c62cb"} Oct 07 13:10:35 crc kubenswrapper[4677]: I1007 13:10:35.055702 4677 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-27x4s" podStartSLOduration=3.420162623 podStartE2EDuration="58.055678546s" podCreationTimestamp="2025-10-07 13:09:37 +0000 UTC" firstStartedPulling="2025-10-07 13:09:39.439096763 +0000 UTC m=+150.924805878" lastFinishedPulling="2025-10-07 13:10:34.074612686 +0000 UTC m=+205.560321801" observedRunningTime="2025-10-07 13:10:35.050468865 +0000 UTC m=+206.536178020" watchObservedRunningTime="2025-10-07 13:10:35.055678546 +0000 UTC m=+206.541387671" Oct 07 13:10:36 crc kubenswrapper[4677]: I1007 13:10:36.043267 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fr6vh" event={"ID":"f81da05e-fd1c-4a91-947d-f5d6958518d0","Type":"ContainerStarted","Data":"e915e91ff82e1bbefb81b5404e62f83e05ff2d891e0fdf976d0263979b7004f4"} Oct 07 13:10:37 crc kubenswrapper[4677]: I1007 13:10:37.052537 4677 generic.go:334] "Generic (PLEG): container finished" podID="f81da05e-fd1c-4a91-947d-f5d6958518d0" containerID="e915e91ff82e1bbefb81b5404e62f83e05ff2d891e0fdf976d0263979b7004f4" exitCode=0 Oct 07 13:10:37 crc kubenswrapper[4677]: I1007 13:10:37.052581 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fr6vh" event={"ID":"f81da05e-fd1c-4a91-947d-f5d6958518d0","Type":"ContainerDied","Data":"e915e91ff82e1bbefb81b5404e62f83e05ff2d891e0fdf976d0263979b7004f4"} Oct 07 13:10:37 crc kubenswrapper[4677]: I1007 13:10:37.472819 4677 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-27x4s" Oct 07 13:10:37 crc kubenswrapper[4677]: I1007 13:10:37.473228 4677 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/certified-operators-27x4s" Oct 07 13:10:39 crc kubenswrapper[4677]: I1007 13:10:39.656157 4677 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-27x4s" podUID="57d9fc75-7df6-4205-9600-0e0d0ff04f8a" containerName="registry-server" probeResult="failure" output=< Oct 07 13:10:39 crc kubenswrapper[4677]: timeout: failed to connect service ":50051" within 1s Oct 07 13:10:39 crc kubenswrapper[4677]: > Oct 07 13:10:40 crc kubenswrapper[4677]: I1007 13:10:40.917725 4677 patch_prober.go:28] interesting pod/machine-config-daemon-r7cnz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 13:10:40 crc kubenswrapper[4677]: I1007 13:10:40.918116 4677 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-r7cnz" podUID="7879fa59-a7cb-4d29-ba3a-c91f43bfcba6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 13:10:40 crc kubenswrapper[4677]: I1007 13:10:40.918187 4677 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-r7cnz" Oct 07 13:10:40 crc kubenswrapper[4677]: I1007 13:10:40.919070 4677 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5d3e4ef8267212ad1faf24bfcb3b6f633a283684ba587e304e94d434bd9a2618"} pod="openshift-machine-config-operator/machine-config-daemon-r7cnz" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 07 13:10:40 crc kubenswrapper[4677]: I1007 13:10:40.919261 4677 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-r7cnz" podUID="7879fa59-a7cb-4d29-ba3a-c91f43bfcba6" containerName="machine-config-daemon" containerID="cri-o://5d3e4ef8267212ad1faf24bfcb3b6f633a283684ba587e304e94d434bd9a2618" gracePeriod=600 Oct 07 13:10:41 crc kubenswrapper[4677]: E1007 13:10:41.268323 4677 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7879fa59_a7cb_4d29_ba3a_c91f43bfcba6.slice/crio-conmon-5d3e4ef8267212ad1faf24bfcb3b6f633a283684ba587e304e94d434bd9a2618.scope\": RecentStats: unable to find data in memory cache]" Oct 07 13:10:42 crc kubenswrapper[4677]: I1007 13:10:42.090494 4677 generic.go:334] "Generic (PLEG): container finished" podID="7879fa59-a7cb-4d29-ba3a-c91f43bfcba6" containerID="5d3e4ef8267212ad1faf24bfcb3b6f633a283684ba587e304e94d434bd9a2618" exitCode=0 Oct 07 13:10:42 crc kubenswrapper[4677]: I1007 13:10:42.090569 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-r7cnz" event={"ID":"7879fa59-a7cb-4d29-ba3a-c91f43bfcba6","Type":"ContainerDied","Data":"5d3e4ef8267212ad1faf24bfcb3b6f633a283684ba587e304e94d434bd9a2618"} Oct 07 13:10:43 crc kubenswrapper[4677]: I1007 13:10:43.097832 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dxpgl" 
event={"ID":"bcd3f7d5-1836-457c-b328-2dc358fd288c","Type":"ContainerStarted","Data":"93b0574f8a0eefe6ce3a348d7fc62fe4902b0f6682453372741ba1096ffed8ce"} Oct 07 13:10:43 crc kubenswrapper[4677]: I1007 13:10:43.099888 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-r7cnz" event={"ID":"7879fa59-a7cb-4d29-ba3a-c91f43bfcba6","Type":"ContainerStarted","Data":"82a5b7d40ad019c3c617ec7d72d51f1fbb5c958d11768560cf6d8828b0539b5b"} Oct 07 13:10:43 crc kubenswrapper[4677]: I1007 13:10:43.102860 4677 generic.go:334] "Generic (PLEG): container finished" podID="8932abb2-d07f-45df-bd32-0ac930df1346" containerID="d74e7174596e235fcd73eb7fa4385f5e2e9b2c502ba5e08d68ef20dea5132083" exitCode=0 Oct 07 13:10:43 crc kubenswrapper[4677]: I1007 13:10:43.102911 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-frwdt" event={"ID":"8932abb2-d07f-45df-bd32-0ac930df1346","Type":"ContainerDied","Data":"d74e7174596e235fcd73eb7fa4385f5e2e9b2c502ba5e08d68ef20dea5132083"} Oct 07 13:10:43 crc kubenswrapper[4677]: I1007 13:10:43.108882 4677 generic.go:334] "Generic (PLEG): container finished" podID="2defae4c-9ad5-42b5-89c7-100b68d49d6d" containerID="26a050179766f0d0b3985f618f29296487ec1587463d20c7948e6e1c8dffd395" exitCode=0 Oct 07 13:10:43 crc kubenswrapper[4677]: I1007 13:10:43.108947 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-84v59" event={"ID":"2defae4c-9ad5-42b5-89c7-100b68d49d6d","Type":"ContainerDied","Data":"26a050179766f0d0b3985f618f29296487ec1587463d20c7948e6e1c8dffd395"} Oct 07 13:10:43 crc kubenswrapper[4677]: I1007 13:10:43.112838 4677 generic.go:334] "Generic (PLEG): container finished" podID="480ea359-f37d-4365-89c5-8f30e79f7c79" containerID="4bce297fc5f80a8f7946d325710445ca4df58ea53310d93a8acad9710c114e4f" exitCode=0 Oct 07 13:10:43 crc kubenswrapper[4677]: I1007 13:10:43.112931 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vmcx7" event={"ID":"480ea359-f37d-4365-89c5-8f30e79f7c79","Type":"ContainerDied","Data":"4bce297fc5f80a8f7946d325710445ca4df58ea53310d93a8acad9710c114e4f"} Oct 07 13:10:43 crc kubenswrapper[4677]: I1007 13:10:43.120250 4677 generic.go:334] "Generic (PLEG): container finished" podID="1ecf8a0f-0b5b-42c2-80c5-cb0a82421387" containerID="cadbd08b91cbe1940e3a928624cf6cbea690e9545aaca2b5f54fc057ca8835cc" exitCode=0 Oct 07 13:10:43 crc kubenswrapper[4677]: I1007 13:10:43.120327 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5cjf7" event={"ID":"1ecf8a0f-0b5b-42c2-80c5-cb0a82421387","Type":"ContainerDied","Data":"cadbd08b91cbe1940e3a928624cf6cbea690e9545aaca2b5f54fc057ca8835cc"} Oct 07 13:10:43 crc kubenswrapper[4677]: I1007 13:10:43.125484 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fr6vh" event={"ID":"f81da05e-fd1c-4a91-947d-f5d6958518d0","Type":"ContainerStarted","Data":"4f8d5d24b13968f152ef16d1bac8769560525aab6e1d48461c454c4c843810e8"} Oct 07 13:10:43 crc kubenswrapper[4677]: I1007 13:10:43.130542 4677 generic.go:334] "Generic (PLEG): container finished" podID="c9473eea-460d-4148-8b4f-f2e0ccba3b2e" containerID="760aab4344ca60eff10acea71401c154ecbb32e6eb82e97c465877cca8649bf7" exitCode=0 Oct 07 13:10:43 crc kubenswrapper[4677]: I1007 13:10:43.130598 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hlxmv" 
event={"ID":"c9473eea-460d-4148-8b4f-f2e0ccba3b2e","Type":"ContainerDied","Data":"760aab4344ca60eff10acea71401c154ecbb32e6eb82e97c465877cca8649bf7"} Oct 07 13:10:43 crc kubenswrapper[4677]: I1007 13:10:43.256000 4677 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-fr6vh" podStartSLOduration=4.394513856 podStartE2EDuration="1m6.25597935s" podCreationTimestamp="2025-10-07 13:09:37 +0000 UTC" firstStartedPulling="2025-10-07 13:09:39.453039065 +0000 UTC m=+150.938748180" lastFinishedPulling="2025-10-07 13:10:41.314504549 +0000 UTC m=+212.800213674" observedRunningTime="2025-10-07 13:10:43.252503449 +0000 UTC m=+214.738212594" watchObservedRunningTime="2025-10-07 13:10:43.25597935 +0000 UTC m=+214.741688465" Oct 07 13:10:44 crc kubenswrapper[4677]: I1007 13:10:44.136613 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hlxmv" event={"ID":"c9473eea-460d-4148-8b4f-f2e0ccba3b2e","Type":"ContainerStarted","Data":"443eb657bd36b30993b21e49cb7afe7da1f3749016a99007eff3883295cfc1c8"} Oct 07 13:10:44 crc kubenswrapper[4677]: I1007 13:10:44.139880 4677 generic.go:334] "Generic (PLEG): container finished" podID="bcd3f7d5-1836-457c-b328-2dc358fd288c" containerID="93b0574f8a0eefe6ce3a348d7fc62fe4902b0f6682453372741ba1096ffed8ce" exitCode=0 Oct 07 13:10:44 crc kubenswrapper[4677]: I1007 13:10:44.140050 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dxpgl" event={"ID":"bcd3f7d5-1836-457c-b328-2dc358fd288c","Type":"ContainerDied","Data":"93b0574f8a0eefe6ce3a348d7fc62fe4902b0f6682453372741ba1096ffed8ce"} Oct 07 13:10:44 crc kubenswrapper[4677]: I1007 13:10:44.143298 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-84v59" event={"ID":"2defae4c-9ad5-42b5-89c7-100b68d49d6d","Type":"ContainerStarted","Data":"01b90c807a9cd29a836399981005951a673322ac338e39e4b7ccc89c8a31a176"} Oct 07 13:10:44 crc kubenswrapper[4677]: I1007 13:10:44.175139 4677 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-84v59" podStartSLOduration=3.096987643 podStartE2EDuration="1m5.175121912s" podCreationTimestamp="2025-10-07 13:09:39 +0000 UTC" firstStartedPulling="2025-10-07 13:09:41.53738316 +0000 UTC m=+153.023092275" lastFinishedPulling="2025-10-07 13:10:43.615517429 +0000 UTC m=+215.101226544" observedRunningTime="2025-10-07 13:10:44.174626678 +0000 UTC m=+215.660335793" watchObservedRunningTime="2025-10-07 13:10:44.175121912 +0000 UTC m=+215.660831027" Oct 07 13:10:44 crc kubenswrapper[4677]: I1007 13:10:44.175821 4677 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-hlxmv" podStartSLOduration=2.938911719 podStartE2EDuration="1m7.175812642s" podCreationTimestamp="2025-10-07 13:09:37 +0000 UTC" firstStartedPulling="2025-10-07 13:09:39.467446023 +0000 UTC m=+150.953155138" lastFinishedPulling="2025-10-07 13:10:43.704346936 +0000 UTC m=+215.190056061" observedRunningTime="2025-10-07 13:10:44.152924099 +0000 UTC m=+215.638633234" watchObservedRunningTime="2025-10-07 13:10:44.175812642 +0000 UTC m=+215.661521757" Oct 07 13:10:45 crc kubenswrapper[4677]: I1007 13:10:45.150998 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dxpgl" 
event={"ID":"bcd3f7d5-1836-457c-b328-2dc358fd288c","Type":"ContainerStarted","Data":"22ac431c61112af2a42d698b35df57c67977299846a820f5aec7736a5a659970"} Oct 07 13:10:45 crc kubenswrapper[4677]: I1007 13:10:45.153224 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-frwdt" event={"ID":"8932abb2-d07f-45df-bd32-0ac930df1346","Type":"ContainerStarted","Data":"c6063234bdae4039c7132918b951101182c614a1997ec7942328e7498b689ef9"} Oct 07 13:10:45 crc kubenswrapper[4677]: I1007 13:10:45.155408 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vmcx7" event={"ID":"480ea359-f37d-4365-89c5-8f30e79f7c79","Type":"ContainerStarted","Data":"b4c657f22d1843ef9e3b2b2db344b10f74875c8e1e583c4fa80e62529d9b8511"} Oct 07 13:10:45 crc kubenswrapper[4677]: I1007 13:10:45.157311 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5cjf7" event={"ID":"1ecf8a0f-0b5b-42c2-80c5-cb0a82421387","Type":"ContainerStarted","Data":"87dcb21eaf9c5439f9d4a6390584bfcb0471d21b0b2cf7184e4b6d78b5b23c56"} Oct 07 13:10:45 crc kubenswrapper[4677]: I1007 13:10:45.194347 4677 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-frwdt" podStartSLOduration=2.30082069 podStartE2EDuration="1m6.194323409s" podCreationTimestamp="2025-10-07 13:09:39 +0000 UTC" firstStartedPulling="2025-10-07 13:09:40.494273363 +0000 UTC m=+151.979982478" lastFinishedPulling="2025-10-07 13:10:44.387776072 +0000 UTC m=+215.873485197" observedRunningTime="2025-10-07 13:10:45.191800016 +0000 UTC m=+216.677509141" watchObservedRunningTime="2025-10-07 13:10:45.194323409 +0000 UTC m=+216.680032524" Oct 07 13:10:45 crc kubenswrapper[4677]: I1007 13:10:45.196526 4677 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-dxpgl" podStartSLOduration=2.8124490680000003 podStartE2EDuration="1m5.196509863s" podCreationTimestamp="2025-10-07 13:09:40 +0000 UTC" firstStartedPulling="2025-10-07 13:09:42.618621355 +0000 UTC m=+154.104330470" lastFinishedPulling="2025-10-07 13:10:45.00268215 +0000 UTC m=+216.488391265" observedRunningTime="2025-10-07 13:10:45.174445633 +0000 UTC m=+216.660154758" watchObservedRunningTime="2025-10-07 13:10:45.196509863 +0000 UTC m=+216.682218988" Oct 07 13:10:45 crc kubenswrapper[4677]: I1007 13:10:45.213127 4677 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-vmcx7" podStartSLOduration=3.806666611 podStartE2EDuration="1m8.213106504s" podCreationTimestamp="2025-10-07 13:09:37 +0000 UTC" firstStartedPulling="2025-10-07 13:09:39.444884586 +0000 UTC m=+150.930593701" lastFinishedPulling="2025-10-07 13:10:43.851324469 +0000 UTC m=+215.337033594" observedRunningTime="2025-10-07 13:10:45.21159003 +0000 UTC m=+216.697299175" watchObservedRunningTime="2025-10-07 13:10:45.213106504 +0000 UTC m=+216.698815639" Oct 07 13:10:45 crc kubenswrapper[4677]: I1007 13:10:45.232661 4677 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-5cjf7" podStartSLOduration=2.347523038 podStartE2EDuration="1m5.232643611s" podCreationTimestamp="2025-10-07 13:09:40 +0000 UTC" firstStartedPulling="2025-10-07 13:09:41.517340315 +0000 UTC m=+153.003049420" lastFinishedPulling="2025-10-07 13:10:44.402460888 +0000 UTC m=+215.888169993" observedRunningTime="2025-10-07 13:10:45.231708334 +0000 UTC 
m=+216.717417449" watchObservedRunningTime="2025-10-07 13:10:45.232643611 +0000 UTC m=+216.718352726" Oct 07 13:10:47 crc kubenswrapper[4677]: I1007 13:10:47.567742 4677 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-27x4s" Oct 07 13:10:47 crc kubenswrapper[4677]: I1007 13:10:47.608685 4677 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-27x4s" Oct 07 13:10:47 crc kubenswrapper[4677]: I1007 13:10:47.630573 4677 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-hlxmv" Oct 07 13:10:47 crc kubenswrapper[4677]: I1007 13:10:47.630614 4677 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-hlxmv" Oct 07 13:10:47 crc kubenswrapper[4677]: I1007 13:10:47.669480 4677 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-hlxmv" Oct 07 13:10:47 crc kubenswrapper[4677]: I1007 13:10:47.906146 4677 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-fr6vh" Oct 07 13:10:47 crc kubenswrapper[4677]: I1007 13:10:47.906185 4677 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-fr6vh" Oct 07 13:10:47 crc kubenswrapper[4677]: I1007 13:10:47.954450 4677 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-fr6vh" Oct 07 13:10:48 crc kubenswrapper[4677]: I1007 13:10:48.046901 4677 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-vmcx7" Oct 07 13:10:48 crc kubenswrapper[4677]: I1007 13:10:48.046951 4677 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-vmcx7" Oct 07 13:10:48 crc kubenswrapper[4677]: I1007 13:10:48.084072 4677 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-vmcx7" Oct 07 13:10:48 crc kubenswrapper[4677]: I1007 13:10:48.213062 4677 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-hlxmv" Oct 07 13:10:48 crc kubenswrapper[4677]: I1007 13:10:48.213614 4677 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-fr6vh" Oct 07 13:10:49 crc kubenswrapper[4677]: I1007 13:10:49.470999 4677 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-frwdt" Oct 07 13:10:49 crc kubenswrapper[4677]: I1007 13:10:49.471294 4677 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-frwdt" Oct 07 13:10:49 crc kubenswrapper[4677]: I1007 13:10:49.517043 4677 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-frwdt" Oct 07 13:10:49 crc kubenswrapper[4677]: I1007 13:10:49.862187 4677 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-84v59" Oct 07 13:10:49 crc kubenswrapper[4677]: I1007 13:10:49.862231 4677 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-84v59" Oct 07 13:10:49 crc 
kubenswrapper[4677]: I1007 13:10:49.905845 4677 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-84v59" Oct 07 13:10:50 crc kubenswrapper[4677]: I1007 13:10:50.220881 4677 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-frwdt" Oct 07 13:10:50 crc kubenswrapper[4677]: I1007 13:10:50.246893 4677 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-84v59" Oct 07 13:10:50 crc kubenswrapper[4677]: I1007 13:10:50.858167 4677 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-5cjf7" Oct 07 13:10:50 crc kubenswrapper[4677]: I1007 13:10:50.858220 4677 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-5cjf7" Oct 07 13:10:50 crc kubenswrapper[4677]: I1007 13:10:50.914126 4677 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-5cjf7" Oct 07 13:10:51 crc kubenswrapper[4677]: I1007 13:10:51.228842 4677 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-5cjf7" Oct 07 13:10:51 crc kubenswrapper[4677]: I1007 13:10:51.248527 4677 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-dxpgl" Oct 07 13:10:51 crc kubenswrapper[4677]: I1007 13:10:51.248579 4677 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-dxpgl" Oct 07 13:10:51 crc kubenswrapper[4677]: I1007 13:10:51.295118 4677 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-dxpgl" Oct 07 13:10:51 crc kubenswrapper[4677]: I1007 13:10:51.339933 4677 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-fr6vh"] Oct 07 13:10:51 crc kubenswrapper[4677]: I1007 13:10:51.340313 4677 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-fr6vh" podUID="f81da05e-fd1c-4a91-947d-f5d6958518d0" containerName="registry-server" containerID="cri-o://4f8d5d24b13968f152ef16d1bac8769560525aab6e1d48461c454c4c843810e8" gracePeriod=2 Oct 07 13:10:51 crc kubenswrapper[4677]: E1007 13:10:51.403037 4677 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf81da05e_fd1c_4a91_947d_f5d6958518d0.slice/crio-4f8d5d24b13968f152ef16d1bac8769560525aab6e1d48461c454c4c843810e8.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf81da05e_fd1c_4a91_947d_f5d6958518d0.slice/crio-conmon-4f8d5d24b13968f152ef16d1bac8769560525aab6e1d48461c454c4c843810e8.scope\": RecentStats: unable to find data in memory cache]" Oct 07 13:10:51 crc kubenswrapper[4677]: I1007 13:10:51.665920 4677 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-fr6vh" Oct 07 13:10:51 crc kubenswrapper[4677]: I1007 13:10:51.795710 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f81da05e-fd1c-4a91-947d-f5d6958518d0-utilities\") pod \"f81da05e-fd1c-4a91-947d-f5d6958518d0\" (UID: \"f81da05e-fd1c-4a91-947d-f5d6958518d0\") " Oct 07 13:10:51 crc kubenswrapper[4677]: I1007 13:10:51.795785 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f81da05e-fd1c-4a91-947d-f5d6958518d0-catalog-content\") pod \"f81da05e-fd1c-4a91-947d-f5d6958518d0\" (UID: \"f81da05e-fd1c-4a91-947d-f5d6958518d0\") " Oct 07 13:10:51 crc kubenswrapper[4677]: I1007 13:10:51.796419 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9fhn\" (UniqueName: \"kubernetes.io/projected/f81da05e-fd1c-4a91-947d-f5d6958518d0-kube-api-access-w9fhn\") pod \"f81da05e-fd1c-4a91-947d-f5d6958518d0\" (UID: \"f81da05e-fd1c-4a91-947d-f5d6958518d0\") " Oct 07 13:10:51 crc kubenswrapper[4677]: I1007 13:10:51.797401 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f81da05e-fd1c-4a91-947d-f5d6958518d0-utilities" (OuterVolumeSpecName: "utilities") pod "f81da05e-fd1c-4a91-947d-f5d6958518d0" (UID: "f81da05e-fd1c-4a91-947d-f5d6958518d0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:10:51 crc kubenswrapper[4677]: I1007 13:10:51.805333 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f81da05e-fd1c-4a91-947d-f5d6958518d0-kube-api-access-w9fhn" (OuterVolumeSpecName: "kube-api-access-w9fhn") pod "f81da05e-fd1c-4a91-947d-f5d6958518d0" (UID: "f81da05e-fd1c-4a91-947d-f5d6958518d0"). InnerVolumeSpecName "kube-api-access-w9fhn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:10:51 crc kubenswrapper[4677]: I1007 13:10:51.869179 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f81da05e-fd1c-4a91-947d-f5d6958518d0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f81da05e-fd1c-4a91-947d-f5d6958518d0" (UID: "f81da05e-fd1c-4a91-947d-f5d6958518d0"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:10:51 crc kubenswrapper[4677]: I1007 13:10:51.898031 4677 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9fhn\" (UniqueName: \"kubernetes.io/projected/f81da05e-fd1c-4a91-947d-f5d6958518d0-kube-api-access-w9fhn\") on node \"crc\" DevicePath \"\"" Oct 07 13:10:51 crc kubenswrapper[4677]: I1007 13:10:51.898074 4677 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f81da05e-fd1c-4a91-947d-f5d6958518d0-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 13:10:51 crc kubenswrapper[4677]: I1007 13:10:51.898083 4677 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f81da05e-fd1c-4a91-947d-f5d6958518d0-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 13:10:51 crc kubenswrapper[4677]: I1007 13:10:51.941720 4677 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-84v59"] Oct 07 13:10:52 crc kubenswrapper[4677]: I1007 13:10:52.200057 4677 generic.go:334] "Generic (PLEG): container finished" podID="f81da05e-fd1c-4a91-947d-f5d6958518d0" containerID="4f8d5d24b13968f152ef16d1bac8769560525aab6e1d48461c454c4c843810e8" exitCode=0 Oct 07 13:10:52 crc kubenswrapper[4677]: I1007 13:10:52.200359 4677 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-84v59" podUID="2defae4c-9ad5-42b5-89c7-100b68d49d6d" containerName="registry-server" containerID="cri-o://01b90c807a9cd29a836399981005951a673322ac338e39e4b7ccc89c8a31a176" gracePeriod=2 Oct 07 13:10:52 crc kubenswrapper[4677]: I1007 13:10:52.201584 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fr6vh" event={"ID":"f81da05e-fd1c-4a91-947d-f5d6958518d0","Type":"ContainerDied","Data":"4f8d5d24b13968f152ef16d1bac8769560525aab6e1d48461c454c4c843810e8"} Oct 07 13:10:52 crc kubenswrapper[4677]: I1007 13:10:52.201690 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fr6vh" event={"ID":"f81da05e-fd1c-4a91-947d-f5d6958518d0","Type":"ContainerDied","Data":"34d8b5e42fe6dd51f159d108517dc50f89c4e2bdcc1c785ca2c33f7ee4088826"} Oct 07 13:10:52 crc kubenswrapper[4677]: I1007 13:10:52.201691 4677 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-fr6vh" Oct 07 13:10:52 crc kubenswrapper[4677]: I1007 13:10:52.201706 4677 scope.go:117] "RemoveContainer" containerID="4f8d5d24b13968f152ef16d1bac8769560525aab6e1d48461c454c4c843810e8" Oct 07 13:10:52 crc kubenswrapper[4677]: I1007 13:10:52.232676 4677 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-fr6vh"] Oct 07 13:10:52 crc kubenswrapper[4677]: I1007 13:10:52.234483 4677 scope.go:117] "RemoveContainer" containerID="e915e91ff82e1bbefb81b5404e62f83e05ff2d891e0fdf976d0263979b7004f4" Oct 07 13:10:52 crc kubenswrapper[4677]: I1007 13:10:52.237085 4677 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-fr6vh"] Oct 07 13:10:52 crc kubenswrapper[4677]: I1007 13:10:52.249023 4677 scope.go:117] "RemoveContainer" containerID="b2f491102438133fcdffececc4b6ee6d9fc684f316d5b0e1e4764705311c7c9a" Oct 07 13:10:52 crc kubenswrapper[4677]: I1007 13:10:52.262379 4677 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-dxpgl" Oct 07 13:10:52 crc kubenswrapper[4677]: I1007 13:10:52.327884 4677 scope.go:117] "RemoveContainer" containerID="4f8d5d24b13968f152ef16d1bac8769560525aab6e1d48461c454c4c843810e8" Oct 07 13:10:52 crc kubenswrapper[4677]: E1007 13:10:52.328302 4677 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4f8d5d24b13968f152ef16d1bac8769560525aab6e1d48461c454c4c843810e8\": container with ID starting with 4f8d5d24b13968f152ef16d1bac8769560525aab6e1d48461c454c4c843810e8 not found: ID does not exist" containerID="4f8d5d24b13968f152ef16d1bac8769560525aab6e1d48461c454c4c843810e8" Oct 07 13:10:52 crc kubenswrapper[4677]: I1007 13:10:52.328334 4677 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f8d5d24b13968f152ef16d1bac8769560525aab6e1d48461c454c4c843810e8"} err="failed to get container status \"4f8d5d24b13968f152ef16d1bac8769560525aab6e1d48461c454c4c843810e8\": rpc error: code = NotFound desc = could not find container \"4f8d5d24b13968f152ef16d1bac8769560525aab6e1d48461c454c4c843810e8\": container with ID starting with 4f8d5d24b13968f152ef16d1bac8769560525aab6e1d48461c454c4c843810e8 not found: ID does not exist" Oct 07 13:10:52 crc kubenswrapper[4677]: I1007 13:10:52.328354 4677 scope.go:117] "RemoveContainer" containerID="e915e91ff82e1bbefb81b5404e62f83e05ff2d891e0fdf976d0263979b7004f4" Oct 07 13:10:52 crc kubenswrapper[4677]: E1007 13:10:52.328732 4677 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e915e91ff82e1bbefb81b5404e62f83e05ff2d891e0fdf976d0263979b7004f4\": container with ID starting with e915e91ff82e1bbefb81b5404e62f83e05ff2d891e0fdf976d0263979b7004f4 not found: ID does not exist" containerID="e915e91ff82e1bbefb81b5404e62f83e05ff2d891e0fdf976d0263979b7004f4" Oct 07 13:10:52 crc kubenswrapper[4677]: I1007 13:10:52.328762 4677 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e915e91ff82e1bbefb81b5404e62f83e05ff2d891e0fdf976d0263979b7004f4"} err="failed to get container status \"e915e91ff82e1bbefb81b5404e62f83e05ff2d891e0fdf976d0263979b7004f4\": rpc error: code = NotFound desc = could not find container \"e915e91ff82e1bbefb81b5404e62f83e05ff2d891e0fdf976d0263979b7004f4\": container with ID starting with 
e915e91ff82e1bbefb81b5404e62f83e05ff2d891e0fdf976d0263979b7004f4 not found: ID does not exist" Oct 07 13:10:52 crc kubenswrapper[4677]: I1007 13:10:52.328776 4677 scope.go:117] "RemoveContainer" containerID="b2f491102438133fcdffececc4b6ee6d9fc684f316d5b0e1e4764705311c7c9a" Oct 07 13:10:52 crc kubenswrapper[4677]: E1007 13:10:52.329024 4677 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b2f491102438133fcdffececc4b6ee6d9fc684f316d5b0e1e4764705311c7c9a\": container with ID starting with b2f491102438133fcdffececc4b6ee6d9fc684f316d5b0e1e4764705311c7c9a not found: ID does not exist" containerID="b2f491102438133fcdffececc4b6ee6d9fc684f316d5b0e1e4764705311c7c9a" Oct 07 13:10:52 crc kubenswrapper[4677]: I1007 13:10:52.329044 4677 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b2f491102438133fcdffececc4b6ee6d9fc684f316d5b0e1e4764705311c7c9a"} err="failed to get container status \"b2f491102438133fcdffececc4b6ee6d9fc684f316d5b0e1e4764705311c7c9a\": rpc error: code = NotFound desc = could not find container \"b2f491102438133fcdffececc4b6ee6d9fc684f316d5b0e1e4764705311c7c9a\": container with ID starting with b2f491102438133fcdffececc4b6ee6d9fc684f316d5b0e1e4764705311c7c9a not found: ID does not exist" Oct 07 13:10:52 crc kubenswrapper[4677]: I1007 13:10:52.580127 4677 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-84v59" Oct 07 13:10:52 crc kubenswrapper[4677]: I1007 13:10:52.708809 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5xmnr\" (UniqueName: \"kubernetes.io/projected/2defae4c-9ad5-42b5-89c7-100b68d49d6d-kube-api-access-5xmnr\") pod \"2defae4c-9ad5-42b5-89c7-100b68d49d6d\" (UID: \"2defae4c-9ad5-42b5-89c7-100b68d49d6d\") " Oct 07 13:10:52 crc kubenswrapper[4677]: I1007 13:10:52.708893 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2defae4c-9ad5-42b5-89c7-100b68d49d6d-utilities\") pod \"2defae4c-9ad5-42b5-89c7-100b68d49d6d\" (UID: \"2defae4c-9ad5-42b5-89c7-100b68d49d6d\") " Oct 07 13:10:52 crc kubenswrapper[4677]: I1007 13:10:52.709075 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2defae4c-9ad5-42b5-89c7-100b68d49d6d-catalog-content\") pod \"2defae4c-9ad5-42b5-89c7-100b68d49d6d\" (UID: \"2defae4c-9ad5-42b5-89c7-100b68d49d6d\") " Oct 07 13:10:52 crc kubenswrapper[4677]: I1007 13:10:52.709802 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2defae4c-9ad5-42b5-89c7-100b68d49d6d-utilities" (OuterVolumeSpecName: "utilities") pod "2defae4c-9ad5-42b5-89c7-100b68d49d6d" (UID: "2defae4c-9ad5-42b5-89c7-100b68d49d6d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:10:52 crc kubenswrapper[4677]: I1007 13:10:52.716578 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2defae4c-9ad5-42b5-89c7-100b68d49d6d-kube-api-access-5xmnr" (OuterVolumeSpecName: "kube-api-access-5xmnr") pod "2defae4c-9ad5-42b5-89c7-100b68d49d6d" (UID: "2defae4c-9ad5-42b5-89c7-100b68d49d6d"). InnerVolumeSpecName "kube-api-access-5xmnr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:10:52 crc kubenswrapper[4677]: I1007 13:10:52.722932 4677 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2defae4c-9ad5-42b5-89c7-100b68d49d6d-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 13:10:52 crc kubenswrapper[4677]: I1007 13:10:52.722966 4677 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5xmnr\" (UniqueName: \"kubernetes.io/projected/2defae4c-9ad5-42b5-89c7-100b68d49d6d-kube-api-access-5xmnr\") on node \"crc\" DevicePath \"\"" Oct 07 13:10:52 crc kubenswrapper[4677]: I1007 13:10:52.732366 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2defae4c-9ad5-42b5-89c7-100b68d49d6d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2defae4c-9ad5-42b5-89c7-100b68d49d6d" (UID: "2defae4c-9ad5-42b5-89c7-100b68d49d6d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:10:52 crc kubenswrapper[4677]: I1007 13:10:52.824663 4677 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2defae4c-9ad5-42b5-89c7-100b68d49d6d-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 13:10:53 crc kubenswrapper[4677]: I1007 13:10:53.207783 4677 generic.go:334] "Generic (PLEG): container finished" podID="2defae4c-9ad5-42b5-89c7-100b68d49d6d" containerID="01b90c807a9cd29a836399981005951a673322ac338e39e4b7ccc89c8a31a176" exitCode=0 Oct 07 13:10:53 crc kubenswrapper[4677]: I1007 13:10:53.207845 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-84v59" event={"ID":"2defae4c-9ad5-42b5-89c7-100b68d49d6d","Type":"ContainerDied","Data":"01b90c807a9cd29a836399981005951a673322ac338e39e4b7ccc89c8a31a176"} Oct 07 13:10:53 crc kubenswrapper[4677]: I1007 13:10:53.208196 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-84v59" event={"ID":"2defae4c-9ad5-42b5-89c7-100b68d49d6d","Type":"ContainerDied","Data":"928cfca758b48eaf9cfe70cbda95c7dfa3a149bb58c6997b293506e59952bd36"} Oct 07 13:10:53 crc kubenswrapper[4677]: I1007 13:10:53.207891 4677 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-84v59" Oct 07 13:10:53 crc kubenswrapper[4677]: I1007 13:10:53.208234 4677 scope.go:117] "RemoveContainer" containerID="01b90c807a9cd29a836399981005951a673322ac338e39e4b7ccc89c8a31a176" Oct 07 13:10:53 crc kubenswrapper[4677]: I1007 13:10:53.221105 4677 scope.go:117] "RemoveContainer" containerID="26a050179766f0d0b3985f618f29296487ec1587463d20c7948e6e1c8dffd395" Oct 07 13:10:53 crc kubenswrapper[4677]: I1007 13:10:53.239583 4677 scope.go:117] "RemoveContainer" containerID="2bba5ebcefbf6ba53a09e3d17645f9dd078b72699b67ac55d1904155d54d31ea" Oct 07 13:10:53 crc kubenswrapper[4677]: I1007 13:10:53.251265 4677 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-84v59"] Oct 07 13:10:53 crc kubenswrapper[4677]: I1007 13:10:53.255034 4677 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-84v59"] Oct 07 13:10:53 crc kubenswrapper[4677]: I1007 13:10:53.258761 4677 scope.go:117] "RemoveContainer" containerID="01b90c807a9cd29a836399981005951a673322ac338e39e4b7ccc89c8a31a176" Oct 07 13:10:53 crc kubenswrapper[4677]: E1007 13:10:53.259129 4677 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"01b90c807a9cd29a836399981005951a673322ac338e39e4b7ccc89c8a31a176\": container with ID starting with 01b90c807a9cd29a836399981005951a673322ac338e39e4b7ccc89c8a31a176 not found: ID does not exist" containerID="01b90c807a9cd29a836399981005951a673322ac338e39e4b7ccc89c8a31a176" Oct 07 13:10:53 crc kubenswrapper[4677]: I1007 13:10:53.259237 4677 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"01b90c807a9cd29a836399981005951a673322ac338e39e4b7ccc89c8a31a176"} err="failed to get container status \"01b90c807a9cd29a836399981005951a673322ac338e39e4b7ccc89c8a31a176\": rpc error: code = NotFound desc = could not find container \"01b90c807a9cd29a836399981005951a673322ac338e39e4b7ccc89c8a31a176\": container with ID starting with 01b90c807a9cd29a836399981005951a673322ac338e39e4b7ccc89c8a31a176 not found: ID does not exist" Oct 07 13:10:53 crc kubenswrapper[4677]: I1007 13:10:53.259330 4677 scope.go:117] "RemoveContainer" containerID="26a050179766f0d0b3985f618f29296487ec1587463d20c7948e6e1c8dffd395" Oct 07 13:10:53 crc kubenswrapper[4677]: E1007 13:10:53.259680 4677 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"26a050179766f0d0b3985f618f29296487ec1587463d20c7948e6e1c8dffd395\": container with ID starting with 26a050179766f0d0b3985f618f29296487ec1587463d20c7948e6e1c8dffd395 not found: ID does not exist" containerID="26a050179766f0d0b3985f618f29296487ec1587463d20c7948e6e1c8dffd395" Oct 07 13:10:53 crc kubenswrapper[4677]: I1007 13:10:53.259797 4677 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"26a050179766f0d0b3985f618f29296487ec1587463d20c7948e6e1c8dffd395"} err="failed to get container status \"26a050179766f0d0b3985f618f29296487ec1587463d20c7948e6e1c8dffd395\": rpc error: code = NotFound desc = could not find container \"26a050179766f0d0b3985f618f29296487ec1587463d20c7948e6e1c8dffd395\": container with ID starting with 26a050179766f0d0b3985f618f29296487ec1587463d20c7948e6e1c8dffd395 not found: ID does not exist" Oct 07 13:10:53 crc kubenswrapper[4677]: I1007 13:10:53.259884 4677 scope.go:117] "RemoveContainer" 
containerID="2bba5ebcefbf6ba53a09e3d17645f9dd078b72699b67ac55d1904155d54d31ea" Oct 07 13:10:53 crc kubenswrapper[4677]: E1007 13:10:53.260273 4677 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2bba5ebcefbf6ba53a09e3d17645f9dd078b72699b67ac55d1904155d54d31ea\": container with ID starting with 2bba5ebcefbf6ba53a09e3d17645f9dd078b72699b67ac55d1904155d54d31ea not found: ID does not exist" containerID="2bba5ebcefbf6ba53a09e3d17645f9dd078b72699b67ac55d1904155d54d31ea" Oct 07 13:10:53 crc kubenswrapper[4677]: I1007 13:10:53.260343 4677 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2bba5ebcefbf6ba53a09e3d17645f9dd078b72699b67ac55d1904155d54d31ea"} err="failed to get container status \"2bba5ebcefbf6ba53a09e3d17645f9dd078b72699b67ac55d1904155d54d31ea\": rpc error: code = NotFound desc = could not find container \"2bba5ebcefbf6ba53a09e3d17645f9dd078b72699b67ac55d1904155d54d31ea\": container with ID starting with 2bba5ebcefbf6ba53a09e3d17645f9dd078b72699b67ac55d1904155d54d31ea not found: ID does not exist" Oct 07 13:10:53 crc kubenswrapper[4677]: I1007 13:10:53.309031 4677 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2defae4c-9ad5-42b5-89c7-100b68d49d6d" path="/var/lib/kubelet/pods/2defae4c-9ad5-42b5-89c7-100b68d49d6d/volumes" Oct 07 13:10:53 crc kubenswrapper[4677]: I1007 13:10:53.309820 4677 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f81da05e-fd1c-4a91-947d-f5d6958518d0" path="/var/lib/kubelet/pods/f81da05e-fd1c-4a91-947d-f5d6958518d0/volumes" Oct 07 13:10:54 crc kubenswrapper[4677]: I1007 13:10:54.341236 4677 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-dxpgl"] Oct 07 13:10:54 crc kubenswrapper[4677]: I1007 13:10:54.342229 4677 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-dxpgl" podUID="bcd3f7d5-1836-457c-b328-2dc358fd288c" containerName="registry-server" containerID="cri-o://22ac431c61112af2a42d698b35df57c67977299846a820f5aec7736a5a659970" gracePeriod=2 Oct 07 13:10:55 crc kubenswrapper[4677]: I1007 13:10:55.992616 4677 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-dxpgl" Oct 07 13:10:56 crc kubenswrapper[4677]: I1007 13:10:56.082345 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bcd3f7d5-1836-457c-b328-2dc358fd288c-catalog-content\") pod \"bcd3f7d5-1836-457c-b328-2dc358fd288c\" (UID: \"bcd3f7d5-1836-457c-b328-2dc358fd288c\") " Oct 07 13:10:56 crc kubenswrapper[4677]: I1007 13:10:56.082450 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l5dxn\" (UniqueName: \"kubernetes.io/projected/bcd3f7d5-1836-457c-b328-2dc358fd288c-kube-api-access-l5dxn\") pod \"bcd3f7d5-1836-457c-b328-2dc358fd288c\" (UID: \"bcd3f7d5-1836-457c-b328-2dc358fd288c\") " Oct 07 13:10:56 crc kubenswrapper[4677]: I1007 13:10:56.082512 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bcd3f7d5-1836-457c-b328-2dc358fd288c-utilities\") pod \"bcd3f7d5-1836-457c-b328-2dc358fd288c\" (UID: \"bcd3f7d5-1836-457c-b328-2dc358fd288c\") " Oct 07 13:10:56 crc kubenswrapper[4677]: I1007 13:10:56.083330 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bcd3f7d5-1836-457c-b328-2dc358fd288c-utilities" (OuterVolumeSpecName: "utilities") pod "bcd3f7d5-1836-457c-b328-2dc358fd288c" (UID: "bcd3f7d5-1836-457c-b328-2dc358fd288c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:10:56 crc kubenswrapper[4677]: I1007 13:10:56.091104 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bcd3f7d5-1836-457c-b328-2dc358fd288c-kube-api-access-l5dxn" (OuterVolumeSpecName: "kube-api-access-l5dxn") pod "bcd3f7d5-1836-457c-b328-2dc358fd288c" (UID: "bcd3f7d5-1836-457c-b328-2dc358fd288c"). InnerVolumeSpecName "kube-api-access-l5dxn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:10:56 crc kubenswrapper[4677]: I1007 13:10:56.183844 4677 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l5dxn\" (UniqueName: \"kubernetes.io/projected/bcd3f7d5-1836-457c-b328-2dc358fd288c-kube-api-access-l5dxn\") on node \"crc\" DevicePath \"\"" Oct 07 13:10:56 crc kubenswrapper[4677]: I1007 13:10:56.183874 4677 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bcd3f7d5-1836-457c-b328-2dc358fd288c-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 13:10:56 crc kubenswrapper[4677]: I1007 13:10:56.232226 4677 generic.go:334] "Generic (PLEG): container finished" podID="bcd3f7d5-1836-457c-b328-2dc358fd288c" containerID="22ac431c61112af2a42d698b35df57c67977299846a820f5aec7736a5a659970" exitCode=0 Oct 07 13:10:56 crc kubenswrapper[4677]: I1007 13:10:56.232269 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dxpgl" event={"ID":"bcd3f7d5-1836-457c-b328-2dc358fd288c","Type":"ContainerDied","Data":"22ac431c61112af2a42d698b35df57c67977299846a820f5aec7736a5a659970"} Oct 07 13:10:56 crc kubenswrapper[4677]: I1007 13:10:56.232291 4677 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-dxpgl" Oct 07 13:10:56 crc kubenswrapper[4677]: I1007 13:10:56.232619 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dxpgl" event={"ID":"bcd3f7d5-1836-457c-b328-2dc358fd288c","Type":"ContainerDied","Data":"f7d4c7dbb6091f73a4e7b0819c847f407db4c9f2aed797bb417935d263204dd0"} Oct 07 13:10:56 crc kubenswrapper[4677]: I1007 13:10:56.232650 4677 scope.go:117] "RemoveContainer" containerID="22ac431c61112af2a42d698b35df57c67977299846a820f5aec7736a5a659970" Oct 07 13:10:56 crc kubenswrapper[4677]: I1007 13:10:56.256844 4677 scope.go:117] "RemoveContainer" containerID="93b0574f8a0eefe6ce3a348d7fc62fe4902b0f6682453372741ba1096ffed8ce" Oct 07 13:10:56 crc kubenswrapper[4677]: I1007 13:10:56.268782 4677 scope.go:117] "RemoveContainer" containerID="da353dea73aa6c9e7385e9c29dd54d65239f9cbf4d2ccb3659438754204ac1ee" Oct 07 13:10:56 crc kubenswrapper[4677]: I1007 13:10:56.281845 4677 scope.go:117] "RemoveContainer" containerID="22ac431c61112af2a42d698b35df57c67977299846a820f5aec7736a5a659970" Oct 07 13:10:56 crc kubenswrapper[4677]: E1007 13:10:56.282262 4677 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"22ac431c61112af2a42d698b35df57c67977299846a820f5aec7736a5a659970\": container with ID starting with 22ac431c61112af2a42d698b35df57c67977299846a820f5aec7736a5a659970 not found: ID does not exist" containerID="22ac431c61112af2a42d698b35df57c67977299846a820f5aec7736a5a659970" Oct 07 13:10:56 crc kubenswrapper[4677]: I1007 13:10:56.282340 4677 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22ac431c61112af2a42d698b35df57c67977299846a820f5aec7736a5a659970"} err="failed to get container status \"22ac431c61112af2a42d698b35df57c67977299846a820f5aec7736a5a659970\": rpc error: code = NotFound desc = could not find container \"22ac431c61112af2a42d698b35df57c67977299846a820f5aec7736a5a659970\": container with ID starting with 22ac431c61112af2a42d698b35df57c67977299846a820f5aec7736a5a659970 not found: ID does not exist" Oct 07 13:10:56 crc kubenswrapper[4677]: I1007 13:10:56.282378 4677 scope.go:117] "RemoveContainer" containerID="93b0574f8a0eefe6ce3a348d7fc62fe4902b0f6682453372741ba1096ffed8ce" Oct 07 13:10:56 crc kubenswrapper[4677]: E1007 13:10:56.282730 4677 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"93b0574f8a0eefe6ce3a348d7fc62fe4902b0f6682453372741ba1096ffed8ce\": container with ID starting with 93b0574f8a0eefe6ce3a348d7fc62fe4902b0f6682453372741ba1096ffed8ce not found: ID does not exist" containerID="93b0574f8a0eefe6ce3a348d7fc62fe4902b0f6682453372741ba1096ffed8ce" Oct 07 13:10:56 crc kubenswrapper[4677]: I1007 13:10:56.282783 4677 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"93b0574f8a0eefe6ce3a348d7fc62fe4902b0f6682453372741ba1096ffed8ce"} err="failed to get container status \"93b0574f8a0eefe6ce3a348d7fc62fe4902b0f6682453372741ba1096ffed8ce\": rpc error: code = NotFound desc = could not find container \"93b0574f8a0eefe6ce3a348d7fc62fe4902b0f6682453372741ba1096ffed8ce\": container with ID starting with 93b0574f8a0eefe6ce3a348d7fc62fe4902b0f6682453372741ba1096ffed8ce not found: ID does not exist" Oct 07 13:10:56 crc kubenswrapper[4677]: I1007 13:10:56.282817 4677 scope.go:117] "RemoveContainer" 
containerID="da353dea73aa6c9e7385e9c29dd54d65239f9cbf4d2ccb3659438754204ac1ee" Oct 07 13:10:56 crc kubenswrapper[4677]: E1007 13:10:56.283147 4677 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"da353dea73aa6c9e7385e9c29dd54d65239f9cbf4d2ccb3659438754204ac1ee\": container with ID starting with da353dea73aa6c9e7385e9c29dd54d65239f9cbf4d2ccb3659438754204ac1ee not found: ID does not exist" containerID="da353dea73aa6c9e7385e9c29dd54d65239f9cbf4d2ccb3659438754204ac1ee" Oct 07 13:10:56 crc kubenswrapper[4677]: I1007 13:10:56.283180 4677 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da353dea73aa6c9e7385e9c29dd54d65239f9cbf4d2ccb3659438754204ac1ee"} err="failed to get container status \"da353dea73aa6c9e7385e9c29dd54d65239f9cbf4d2ccb3659438754204ac1ee\": rpc error: code = NotFound desc = could not find container \"da353dea73aa6c9e7385e9c29dd54d65239f9cbf4d2ccb3659438754204ac1ee\": container with ID starting with da353dea73aa6c9e7385e9c29dd54d65239f9cbf4d2ccb3659438754204ac1ee not found: ID does not exist" Oct 07 13:10:56 crc kubenswrapper[4677]: I1007 13:10:56.358459 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bcd3f7d5-1836-457c-b328-2dc358fd288c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bcd3f7d5-1836-457c-b328-2dc358fd288c" (UID: "bcd3f7d5-1836-457c-b328-2dc358fd288c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:10:56 crc kubenswrapper[4677]: I1007 13:10:56.385848 4677 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bcd3f7d5-1836-457c-b328-2dc358fd288c-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 13:10:56 crc kubenswrapper[4677]: I1007 13:10:56.559579 4677 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-dxpgl"] Oct 07 13:10:56 crc kubenswrapper[4677]: I1007 13:10:56.563120 4677 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-dxpgl"] Oct 07 13:10:57 crc kubenswrapper[4677]: I1007 13:10:57.308903 4677 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bcd3f7d5-1836-457c-b328-2dc358fd288c" path="/var/lib/kubelet/pods/bcd3f7d5-1836-457c-b328-2dc358fd288c/volumes" Oct 07 13:10:58 crc kubenswrapper[4677]: I1007 13:10:58.092822 4677 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-vmcx7" Oct 07 13:11:00 crc kubenswrapper[4677]: I1007 13:11:00.925393 4677 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-gspst"] Oct 07 13:11:01 crc kubenswrapper[4677]: I1007 13:11:01.742837 4677 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vmcx7"] Oct 07 13:11:01 crc kubenswrapper[4677]: I1007 13:11:01.743459 4677 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-vmcx7" podUID="480ea359-f37d-4365-89c5-8f30e79f7c79" containerName="registry-server" containerID="cri-o://b4c657f22d1843ef9e3b2b2db344b10f74875c8e1e583c4fa80e62529d9b8511" gracePeriod=2 Oct 07 13:11:02 crc kubenswrapper[4677]: I1007 13:11:02.107984 4677 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vmcx7" Oct 07 13:11:02 crc kubenswrapper[4677]: I1007 13:11:02.149618 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/480ea359-f37d-4365-89c5-8f30e79f7c79-utilities\") pod \"480ea359-f37d-4365-89c5-8f30e79f7c79\" (UID: \"480ea359-f37d-4365-89c5-8f30e79f7c79\") " Oct 07 13:11:02 crc kubenswrapper[4677]: I1007 13:11:02.149672 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l4tq9\" (UniqueName: \"kubernetes.io/projected/480ea359-f37d-4365-89c5-8f30e79f7c79-kube-api-access-l4tq9\") pod \"480ea359-f37d-4365-89c5-8f30e79f7c79\" (UID: \"480ea359-f37d-4365-89c5-8f30e79f7c79\") " Oct 07 13:11:02 crc kubenswrapper[4677]: I1007 13:11:02.149759 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/480ea359-f37d-4365-89c5-8f30e79f7c79-catalog-content\") pod \"480ea359-f37d-4365-89c5-8f30e79f7c79\" (UID: \"480ea359-f37d-4365-89c5-8f30e79f7c79\") " Oct 07 13:11:02 crc kubenswrapper[4677]: I1007 13:11:02.150923 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/480ea359-f37d-4365-89c5-8f30e79f7c79-utilities" (OuterVolumeSpecName: "utilities") pod "480ea359-f37d-4365-89c5-8f30e79f7c79" (UID: "480ea359-f37d-4365-89c5-8f30e79f7c79"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:11:02 crc kubenswrapper[4677]: I1007 13:11:02.157601 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/480ea359-f37d-4365-89c5-8f30e79f7c79-kube-api-access-l4tq9" (OuterVolumeSpecName: "kube-api-access-l4tq9") pod "480ea359-f37d-4365-89c5-8f30e79f7c79" (UID: "480ea359-f37d-4365-89c5-8f30e79f7c79"). InnerVolumeSpecName "kube-api-access-l4tq9". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:11:02 crc kubenswrapper[4677]: I1007 13:11:02.192278 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/480ea359-f37d-4365-89c5-8f30e79f7c79-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "480ea359-f37d-4365-89c5-8f30e79f7c79" (UID: "480ea359-f37d-4365-89c5-8f30e79f7c79"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:11:02 crc kubenswrapper[4677]: I1007 13:11:02.251148 4677 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/480ea359-f37d-4365-89c5-8f30e79f7c79-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 13:11:02 crc kubenswrapper[4677]: I1007 13:11:02.251205 4677 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/480ea359-f37d-4365-89c5-8f30e79f7c79-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 13:11:02 crc kubenswrapper[4677]: I1007 13:11:02.251231 4677 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l4tq9\" (UniqueName: \"kubernetes.io/projected/480ea359-f37d-4365-89c5-8f30e79f7c79-kube-api-access-l4tq9\") on node \"crc\" DevicePath \"\"" Oct 07 13:11:02 crc kubenswrapper[4677]: I1007 13:11:02.268423 4677 generic.go:334] "Generic (PLEG): container finished" podID="480ea359-f37d-4365-89c5-8f30e79f7c79" containerID="b4c657f22d1843ef9e3b2b2db344b10f74875c8e1e583c4fa80e62529d9b8511" exitCode=0 Oct 07 13:11:02 crc kubenswrapper[4677]: I1007 13:11:02.268497 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vmcx7" event={"ID":"480ea359-f37d-4365-89c5-8f30e79f7c79","Type":"ContainerDied","Data":"b4c657f22d1843ef9e3b2b2db344b10f74875c8e1e583c4fa80e62529d9b8511"} Oct 07 13:11:02 crc kubenswrapper[4677]: I1007 13:11:02.268556 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vmcx7" event={"ID":"480ea359-f37d-4365-89c5-8f30e79f7c79","Type":"ContainerDied","Data":"45c25972dc11a541beb62dad3af849a37513ba7084345333ccb161dfa7f4e54f"} Oct 07 13:11:02 crc kubenswrapper[4677]: I1007 13:11:02.268579 4677 scope.go:117] "RemoveContainer" containerID="b4c657f22d1843ef9e3b2b2db344b10f74875c8e1e583c4fa80e62529d9b8511" Oct 07 13:11:02 crc kubenswrapper[4677]: I1007 13:11:02.268574 4677 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vmcx7" Oct 07 13:11:02 crc kubenswrapper[4677]: I1007 13:11:02.292337 4677 scope.go:117] "RemoveContainer" containerID="4bce297fc5f80a8f7946d325710445ca4df58ea53310d93a8acad9710c114e4f" Oct 07 13:11:02 crc kubenswrapper[4677]: I1007 13:11:02.309021 4677 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vmcx7"] Oct 07 13:11:02 crc kubenswrapper[4677]: I1007 13:11:02.314803 4677 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-vmcx7"] Oct 07 13:11:02 crc kubenswrapper[4677]: I1007 13:11:02.344616 4677 scope.go:117] "RemoveContainer" containerID="d27ec6c67dbefada027f7031b82867e3c57eeccad0ea29e57519ebeb699002e2" Oct 07 13:11:02 crc kubenswrapper[4677]: I1007 13:11:02.360586 4677 scope.go:117] "RemoveContainer" containerID="b4c657f22d1843ef9e3b2b2db344b10f74875c8e1e583c4fa80e62529d9b8511" Oct 07 13:11:02 crc kubenswrapper[4677]: E1007 13:11:02.360876 4677 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b4c657f22d1843ef9e3b2b2db344b10f74875c8e1e583c4fa80e62529d9b8511\": container with ID starting with b4c657f22d1843ef9e3b2b2db344b10f74875c8e1e583c4fa80e62529d9b8511 not found: ID does not exist" containerID="b4c657f22d1843ef9e3b2b2db344b10f74875c8e1e583c4fa80e62529d9b8511" Oct 07 13:11:02 crc kubenswrapper[4677]: I1007 13:11:02.360908 4677 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b4c657f22d1843ef9e3b2b2db344b10f74875c8e1e583c4fa80e62529d9b8511"} err="failed to get container status \"b4c657f22d1843ef9e3b2b2db344b10f74875c8e1e583c4fa80e62529d9b8511\": rpc error: code = NotFound desc = could not find container \"b4c657f22d1843ef9e3b2b2db344b10f74875c8e1e583c4fa80e62529d9b8511\": container with ID starting with b4c657f22d1843ef9e3b2b2db344b10f74875c8e1e583c4fa80e62529d9b8511 not found: ID does not exist" Oct 07 13:11:02 crc kubenswrapper[4677]: I1007 13:11:02.360933 4677 scope.go:117] "RemoveContainer" containerID="4bce297fc5f80a8f7946d325710445ca4df58ea53310d93a8acad9710c114e4f" Oct 07 13:11:02 crc kubenswrapper[4677]: E1007 13:11:02.361300 4677 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4bce297fc5f80a8f7946d325710445ca4df58ea53310d93a8acad9710c114e4f\": container with ID starting with 4bce297fc5f80a8f7946d325710445ca4df58ea53310d93a8acad9710c114e4f not found: ID does not exist" containerID="4bce297fc5f80a8f7946d325710445ca4df58ea53310d93a8acad9710c114e4f" Oct 07 13:11:02 crc kubenswrapper[4677]: I1007 13:11:02.361328 4677 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4bce297fc5f80a8f7946d325710445ca4df58ea53310d93a8acad9710c114e4f"} err="failed to get container status \"4bce297fc5f80a8f7946d325710445ca4df58ea53310d93a8acad9710c114e4f\": rpc error: code = NotFound desc = could not find container \"4bce297fc5f80a8f7946d325710445ca4df58ea53310d93a8acad9710c114e4f\": container with ID starting with 4bce297fc5f80a8f7946d325710445ca4df58ea53310d93a8acad9710c114e4f not found: ID does not exist" Oct 07 13:11:02 crc kubenswrapper[4677]: I1007 13:11:02.361341 4677 scope.go:117] "RemoveContainer" containerID="d27ec6c67dbefada027f7031b82867e3c57eeccad0ea29e57519ebeb699002e2" Oct 07 13:11:02 crc kubenswrapper[4677]: E1007 13:11:02.361586 4677 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"d27ec6c67dbefada027f7031b82867e3c57eeccad0ea29e57519ebeb699002e2\": container with ID starting with d27ec6c67dbefada027f7031b82867e3c57eeccad0ea29e57519ebeb699002e2 not found: ID does not exist" containerID="d27ec6c67dbefada027f7031b82867e3c57eeccad0ea29e57519ebeb699002e2" Oct 07 13:11:02 crc kubenswrapper[4677]: I1007 13:11:02.361606 4677 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d27ec6c67dbefada027f7031b82867e3c57eeccad0ea29e57519ebeb699002e2"} err="failed to get container status \"d27ec6c67dbefada027f7031b82867e3c57eeccad0ea29e57519ebeb699002e2\": rpc error: code = NotFound desc = could not find container \"d27ec6c67dbefada027f7031b82867e3c57eeccad0ea29e57519ebeb699002e2\": container with ID starting with d27ec6c67dbefada027f7031b82867e3c57eeccad0ea29e57519ebeb699002e2 not found: ID does not exist" Oct 07 13:11:03 crc kubenswrapper[4677]: I1007 13:11:03.309143 4677 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="480ea359-f37d-4365-89c5-8f30e79f7c79" path="/var/lib/kubelet/pods/480ea359-f37d-4365-89c5-8f30e79f7c79/volumes" Oct 07 13:11:25 crc kubenswrapper[4677]: I1007 13:11:25.966670 4677 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-gspst" podUID="bd1c8146-fe00-4a53-a102-17cfc6ef045b" containerName="oauth-openshift" containerID="cri-o://da5f756ed2ce40fda4eaff3fc908b803777660990da37cf7bb6c8460c2373e85" gracePeriod=15 Oct 07 13:11:26 crc kubenswrapper[4677]: I1007 13:11:26.411863 4677 generic.go:334] "Generic (PLEG): container finished" podID="bd1c8146-fe00-4a53-a102-17cfc6ef045b" containerID="da5f756ed2ce40fda4eaff3fc908b803777660990da37cf7bb6c8460c2373e85" exitCode=0 Oct 07 13:11:26 crc kubenswrapper[4677]: I1007 13:11:26.411987 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-gspst" event={"ID":"bd1c8146-fe00-4a53-a102-17cfc6ef045b","Type":"ContainerDied","Data":"da5f756ed2ce40fda4eaff3fc908b803777660990da37cf7bb6c8460c2373e85"} Oct 07 13:11:26 crc kubenswrapper[4677]: I1007 13:11:26.412484 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-gspst" event={"ID":"bd1c8146-fe00-4a53-a102-17cfc6ef045b","Type":"ContainerDied","Data":"7936f89d94c94cac5451e640bb52c560dfb7ad0a9e476277b82d7070d637ad6a"} Oct 07 13:11:26 crc kubenswrapper[4677]: I1007 13:11:26.412546 4677 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7936f89d94c94cac5451e640bb52c560dfb7ad0a9e476277b82d7070d637ad6a" Oct 07 13:11:26 crc kubenswrapper[4677]: I1007 13:11:26.418249 4677 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-gspst" Oct 07 13:11:26 crc kubenswrapper[4677]: I1007 13:11:26.460527 4677 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-86d85988f6-rq46k"] Oct 07 13:11:26 crc kubenswrapper[4677]: E1007 13:11:26.460872 4677 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="480ea359-f37d-4365-89c5-8f30e79f7c79" containerName="registry-server" Oct 07 13:11:26 crc kubenswrapper[4677]: I1007 13:11:26.460894 4677 state_mem.go:107] "Deleted CPUSet assignment" podUID="480ea359-f37d-4365-89c5-8f30e79f7c79" containerName="registry-server" Oct 07 13:11:26 crc kubenswrapper[4677]: E1007 13:11:26.460946 4677 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2defae4c-9ad5-42b5-89c7-100b68d49d6d" containerName="extract-utilities" Oct 07 13:11:26 crc kubenswrapper[4677]: I1007 13:11:26.460963 4677 state_mem.go:107] "Deleted CPUSet assignment" podUID="2defae4c-9ad5-42b5-89c7-100b68d49d6d" containerName="extract-utilities" Oct 07 13:11:26 crc kubenswrapper[4677]: E1007 13:11:26.460983 4677 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f81da05e-fd1c-4a91-947d-f5d6958518d0" containerName="extract-utilities" Oct 07 13:11:26 crc kubenswrapper[4677]: I1007 13:11:26.460996 4677 state_mem.go:107] "Deleted CPUSet assignment" podUID="f81da05e-fd1c-4a91-947d-f5d6958518d0" containerName="extract-utilities" Oct 07 13:11:26 crc kubenswrapper[4677]: E1007 13:11:26.461015 4677 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="480ea359-f37d-4365-89c5-8f30e79f7c79" containerName="extract-utilities" Oct 07 13:11:26 crc kubenswrapper[4677]: I1007 13:11:26.461027 4677 state_mem.go:107] "Deleted CPUSet assignment" podUID="480ea359-f37d-4365-89c5-8f30e79f7c79" containerName="extract-utilities" Oct 07 13:11:26 crc kubenswrapper[4677]: E1007 13:11:26.461046 4677 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2defae4c-9ad5-42b5-89c7-100b68d49d6d" containerName="registry-server" Oct 07 13:11:26 crc kubenswrapper[4677]: I1007 13:11:26.461057 4677 state_mem.go:107] "Deleted CPUSet assignment" podUID="2defae4c-9ad5-42b5-89c7-100b68d49d6d" containerName="registry-server" Oct 07 13:11:26 crc kubenswrapper[4677]: E1007 13:11:26.461077 4677 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe90fc09-1c81-4cbe-b00c-6f6004f1dcf3" containerName="pruner" Oct 07 13:11:26 crc kubenswrapper[4677]: I1007 13:11:26.461089 4677 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe90fc09-1c81-4cbe-b00c-6f6004f1dcf3" containerName="pruner" Oct 07 13:11:26 crc kubenswrapper[4677]: E1007 13:11:26.461106 4677 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bcd3f7d5-1836-457c-b328-2dc358fd288c" containerName="extract-utilities" Oct 07 13:11:26 crc kubenswrapper[4677]: I1007 13:11:26.461121 4677 state_mem.go:107] "Deleted CPUSet assignment" podUID="bcd3f7d5-1836-457c-b328-2dc358fd288c" containerName="extract-utilities" Oct 07 13:11:26 crc kubenswrapper[4677]: E1007 13:11:26.461140 4677 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd1c8146-fe00-4a53-a102-17cfc6ef045b" containerName="oauth-openshift" Oct 07 13:11:26 crc kubenswrapper[4677]: I1007 13:11:26.461156 4677 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd1c8146-fe00-4a53-a102-17cfc6ef045b" containerName="oauth-openshift" Oct 07 13:11:26 crc kubenswrapper[4677]: E1007 13:11:26.461182 4677 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="2defae4c-9ad5-42b5-89c7-100b68d49d6d" containerName="extract-content" Oct 07 13:11:26 crc kubenswrapper[4677]: I1007 13:11:26.461199 4677 state_mem.go:107] "Deleted CPUSet assignment" podUID="2defae4c-9ad5-42b5-89c7-100b68d49d6d" containerName="extract-content" Oct 07 13:11:26 crc kubenswrapper[4677]: E1007 13:11:26.461218 4677 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b269f98-8c39-4496-b4ab-84147ace5e15" containerName="pruner" Oct 07 13:11:26 crc kubenswrapper[4677]: I1007 13:11:26.461234 4677 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b269f98-8c39-4496-b4ab-84147ace5e15" containerName="pruner" Oct 07 13:11:26 crc kubenswrapper[4677]: E1007 13:11:26.461252 4677 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f81da05e-fd1c-4a91-947d-f5d6958518d0" containerName="registry-server" Oct 07 13:11:26 crc kubenswrapper[4677]: I1007 13:11:26.461268 4677 state_mem.go:107] "Deleted CPUSet assignment" podUID="f81da05e-fd1c-4a91-947d-f5d6958518d0" containerName="registry-server" Oct 07 13:11:26 crc kubenswrapper[4677]: E1007 13:11:26.461290 4677 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="480ea359-f37d-4365-89c5-8f30e79f7c79" containerName="extract-content" Oct 07 13:11:26 crc kubenswrapper[4677]: I1007 13:11:26.461302 4677 state_mem.go:107] "Deleted CPUSet assignment" podUID="480ea359-f37d-4365-89c5-8f30e79f7c79" containerName="extract-content" Oct 07 13:11:26 crc kubenswrapper[4677]: E1007 13:11:26.461316 4677 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bcd3f7d5-1836-457c-b328-2dc358fd288c" containerName="extract-content" Oct 07 13:11:26 crc kubenswrapper[4677]: I1007 13:11:26.461328 4677 state_mem.go:107] "Deleted CPUSet assignment" podUID="bcd3f7d5-1836-457c-b328-2dc358fd288c" containerName="extract-content" Oct 07 13:11:26 crc kubenswrapper[4677]: E1007 13:11:26.461349 4677 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bcd3f7d5-1836-457c-b328-2dc358fd288c" containerName="registry-server" Oct 07 13:11:26 crc kubenswrapper[4677]: I1007 13:11:26.461360 4677 state_mem.go:107] "Deleted CPUSet assignment" podUID="bcd3f7d5-1836-457c-b328-2dc358fd288c" containerName="registry-server" Oct 07 13:11:26 crc kubenswrapper[4677]: E1007 13:11:26.461381 4677 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f81da05e-fd1c-4a91-947d-f5d6958518d0" containerName="extract-content" Oct 07 13:11:26 crc kubenswrapper[4677]: I1007 13:11:26.461393 4677 state_mem.go:107] "Deleted CPUSet assignment" podUID="f81da05e-fd1c-4a91-947d-f5d6958518d0" containerName="extract-content" Oct 07 13:11:26 crc kubenswrapper[4677]: I1007 13:11:26.461616 4677 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd1c8146-fe00-4a53-a102-17cfc6ef045b" containerName="oauth-openshift" Oct 07 13:11:26 crc kubenswrapper[4677]: I1007 13:11:26.461639 4677 memory_manager.go:354] "RemoveStaleState removing state" podUID="2defae4c-9ad5-42b5-89c7-100b68d49d6d" containerName="registry-server" Oct 07 13:11:26 crc kubenswrapper[4677]: I1007 13:11:26.461663 4677 memory_manager.go:354] "RemoveStaleState removing state" podUID="480ea359-f37d-4365-89c5-8f30e79f7c79" containerName="registry-server" Oct 07 13:11:26 crc kubenswrapper[4677]: I1007 13:11:26.461679 4677 memory_manager.go:354] "RemoveStaleState removing state" podUID="f81da05e-fd1c-4a91-947d-f5d6958518d0" containerName="registry-server" Oct 07 13:11:26 crc kubenswrapper[4677]: I1007 13:11:26.461691 4677 
memory_manager.go:354] "RemoveStaleState removing state" podUID="fe90fc09-1c81-4cbe-b00c-6f6004f1dcf3" containerName="pruner" Oct 07 13:11:26 crc kubenswrapper[4677]: I1007 13:11:26.461707 4677 memory_manager.go:354] "RemoveStaleState removing state" podUID="bcd3f7d5-1836-457c-b328-2dc358fd288c" containerName="registry-server" Oct 07 13:11:26 crc kubenswrapper[4677]: I1007 13:11:26.461721 4677 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b269f98-8c39-4496-b4ab-84147ace5e15" containerName="pruner" Oct 07 13:11:26 crc kubenswrapper[4677]: I1007 13:11:26.462290 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-86d85988f6-rq46k" Oct 07 13:11:26 crc kubenswrapper[4677]: I1007 13:11:26.466839 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-86d85988f6-rq46k"] Oct 07 13:11:26 crc kubenswrapper[4677]: I1007 13:11:26.511119 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/bd1c8146-fe00-4a53-a102-17cfc6ef045b-v4-0-config-system-service-ca\") pod \"bd1c8146-fe00-4a53-a102-17cfc6ef045b\" (UID: \"bd1c8146-fe00-4a53-a102-17cfc6ef045b\") " Oct 07 13:11:26 crc kubenswrapper[4677]: I1007 13:11:26.511201 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/bd1c8146-fe00-4a53-a102-17cfc6ef045b-v4-0-config-user-template-login\") pod \"bd1c8146-fe00-4a53-a102-17cfc6ef045b\" (UID: \"bd1c8146-fe00-4a53-a102-17cfc6ef045b\") " Oct 07 13:11:26 crc kubenswrapper[4677]: I1007 13:11:26.511263 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/bd1c8146-fe00-4a53-a102-17cfc6ef045b-audit-policies\") pod \"bd1c8146-fe00-4a53-a102-17cfc6ef045b\" (UID: \"bd1c8146-fe00-4a53-a102-17cfc6ef045b\") " Oct 07 13:11:26 crc kubenswrapper[4677]: I1007 13:11:26.511378 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/bd1c8146-fe00-4a53-a102-17cfc6ef045b-v4-0-config-system-session\") pod \"bd1c8146-fe00-4a53-a102-17cfc6ef045b\" (UID: \"bd1c8146-fe00-4a53-a102-17cfc6ef045b\") " Oct 07 13:11:26 crc kubenswrapper[4677]: I1007 13:11:26.511465 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/bd1c8146-fe00-4a53-a102-17cfc6ef045b-v4-0-config-system-router-certs\") pod \"bd1c8146-fe00-4a53-a102-17cfc6ef045b\" (UID: \"bd1c8146-fe00-4a53-a102-17cfc6ef045b\") " Oct 07 13:11:26 crc kubenswrapper[4677]: I1007 13:11:26.511564 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/bd1c8146-fe00-4a53-a102-17cfc6ef045b-v4-0-config-system-ocp-branding-template\") pod \"bd1c8146-fe00-4a53-a102-17cfc6ef045b\" (UID: \"bd1c8146-fe00-4a53-a102-17cfc6ef045b\") " Oct 07 13:11:26 crc kubenswrapper[4677]: I1007 13:11:26.512552 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/bd1c8146-fe00-4a53-a102-17cfc6ef045b-v4-0-config-system-cliconfig\") pod \"bd1c8146-fe00-4a53-a102-17cfc6ef045b\" (UID: 
\"bd1c8146-fe00-4a53-a102-17cfc6ef045b\") " Oct 07 13:11:26 crc kubenswrapper[4677]: I1007 13:11:26.512618 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bd1c8146-fe00-4a53-a102-17cfc6ef045b-v4-0-config-system-trusted-ca-bundle\") pod \"bd1c8146-fe00-4a53-a102-17cfc6ef045b\" (UID: \"bd1c8146-fe00-4a53-a102-17cfc6ef045b\") " Oct 07 13:11:26 crc kubenswrapper[4677]: I1007 13:11:26.512658 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/bd1c8146-fe00-4a53-a102-17cfc6ef045b-v4-0-config-user-template-error\") pod \"bd1c8146-fe00-4a53-a102-17cfc6ef045b\" (UID: \"bd1c8146-fe00-4a53-a102-17cfc6ef045b\") " Oct 07 13:11:26 crc kubenswrapper[4677]: I1007 13:11:26.512693 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d5xbw\" (UniqueName: \"kubernetes.io/projected/bd1c8146-fe00-4a53-a102-17cfc6ef045b-kube-api-access-d5xbw\") pod \"bd1c8146-fe00-4a53-a102-17cfc6ef045b\" (UID: \"bd1c8146-fe00-4a53-a102-17cfc6ef045b\") " Oct 07 13:11:26 crc kubenswrapper[4677]: I1007 13:11:26.512737 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/bd1c8146-fe00-4a53-a102-17cfc6ef045b-v4-0-config-user-idp-0-file-data\") pod \"bd1c8146-fe00-4a53-a102-17cfc6ef045b\" (UID: \"bd1c8146-fe00-4a53-a102-17cfc6ef045b\") " Oct 07 13:11:26 crc kubenswrapper[4677]: I1007 13:11:26.512781 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/bd1c8146-fe00-4a53-a102-17cfc6ef045b-audit-dir\") pod \"bd1c8146-fe00-4a53-a102-17cfc6ef045b\" (UID: \"bd1c8146-fe00-4a53-a102-17cfc6ef045b\") " Oct 07 13:11:26 crc kubenswrapper[4677]: I1007 13:11:26.512031 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bd1c8146-fe00-4a53-a102-17cfc6ef045b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "bd1c8146-fe00-4a53-a102-17cfc6ef045b" (UID: "bd1c8146-fe00-4a53-a102-17cfc6ef045b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:11:26 crc kubenswrapper[4677]: I1007 13:11:26.512183 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bd1c8146-fe00-4a53-a102-17cfc6ef045b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "bd1c8146-fe00-4a53-a102-17cfc6ef045b" (UID: "bd1c8146-fe00-4a53-a102-17cfc6ef045b"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:11:26 crc kubenswrapper[4677]: I1007 13:11:26.512816 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/bd1c8146-fe00-4a53-a102-17cfc6ef045b-v4-0-config-user-template-provider-selection\") pod \"bd1c8146-fe00-4a53-a102-17cfc6ef045b\" (UID: \"bd1c8146-fe00-4a53-a102-17cfc6ef045b\") " Oct 07 13:11:26 crc kubenswrapper[4677]: I1007 13:11:26.512909 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/bd1c8146-fe00-4a53-a102-17cfc6ef045b-v4-0-config-system-serving-cert\") pod \"bd1c8146-fe00-4a53-a102-17cfc6ef045b\" (UID: \"bd1c8146-fe00-4a53-a102-17cfc6ef045b\") " Oct 07 13:11:26 crc kubenswrapper[4677]: I1007 13:11:26.513133 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/f6742cb6-7552-442f-8a00-8b016b7a268f-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-86d85988f6-rq46k\" (UID: \"f6742cb6-7552-442f-8a00-8b016b7a268f\") " pod="openshift-authentication/oauth-openshift-86d85988f6-rq46k" Oct 07 13:11:26 crc kubenswrapper[4677]: I1007 13:11:26.513192 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/f6742cb6-7552-442f-8a00-8b016b7a268f-v4-0-config-system-session\") pod \"oauth-openshift-86d85988f6-rq46k\" (UID: \"f6742cb6-7552-442f-8a00-8b016b7a268f\") " pod="openshift-authentication/oauth-openshift-86d85988f6-rq46k" Oct 07 13:11:26 crc kubenswrapper[4677]: I1007 13:11:26.513212 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bd1c8146-fe00-4a53-a102-17cfc6ef045b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "bd1c8146-fe00-4a53-a102-17cfc6ef045b" (UID: "bd1c8146-fe00-4a53-a102-17cfc6ef045b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:11:26 crc kubenswrapper[4677]: I1007 13:11:26.513219 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f6742cb6-7552-442f-8a00-8b016b7a268f-audit-dir\") pod \"oauth-openshift-86d85988f6-rq46k\" (UID: \"f6742cb6-7552-442f-8a00-8b016b7a268f\") " pod="openshift-authentication/oauth-openshift-86d85988f6-rq46k" Oct 07 13:11:26 crc kubenswrapper[4677]: I1007 13:11:26.513295 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f6742cb6-7552-442f-8a00-8b016b7a268f-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-86d85988f6-rq46k\" (UID: \"f6742cb6-7552-442f-8a00-8b016b7a268f\") " pod="openshift-authentication/oauth-openshift-86d85988f6-rq46k" Oct 07 13:11:26 crc kubenswrapper[4677]: I1007 13:11:26.513342 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/f6742cb6-7552-442f-8a00-8b016b7a268f-v4-0-config-system-router-certs\") pod \"oauth-openshift-86d85988f6-rq46k\" (UID: \"f6742cb6-7552-442f-8a00-8b016b7a268f\") " pod="openshift-authentication/oauth-openshift-86d85988f6-rq46k" Oct 07 13:11:26 crc kubenswrapper[4677]: I1007 13:11:26.513376 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/f6742cb6-7552-442f-8a00-8b016b7a268f-v4-0-config-user-template-error\") pod \"oauth-openshift-86d85988f6-rq46k\" (UID: \"f6742cb6-7552-442f-8a00-8b016b7a268f\") " pod="openshift-authentication/oauth-openshift-86d85988f6-rq46k" Oct 07 13:11:26 crc kubenswrapper[4677]: I1007 13:11:26.513408 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/f6742cb6-7552-442f-8a00-8b016b7a268f-v4-0-config-user-template-login\") pod \"oauth-openshift-86d85988f6-rq46k\" (UID: \"f6742cb6-7552-442f-8a00-8b016b7a268f\") " pod="openshift-authentication/oauth-openshift-86d85988f6-rq46k" Oct 07 13:11:26 crc kubenswrapper[4677]: I1007 13:11:26.513461 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5nkhk\" (UniqueName: \"kubernetes.io/projected/f6742cb6-7552-442f-8a00-8b016b7a268f-kube-api-access-5nkhk\") pod \"oauth-openshift-86d85988f6-rq46k\" (UID: \"f6742cb6-7552-442f-8a00-8b016b7a268f\") " pod="openshift-authentication/oauth-openshift-86d85988f6-rq46k" Oct 07 13:11:26 crc kubenswrapper[4677]: I1007 13:11:26.513493 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f6742cb6-7552-442f-8a00-8b016b7a268f-audit-policies\") pod \"oauth-openshift-86d85988f6-rq46k\" (UID: \"f6742cb6-7552-442f-8a00-8b016b7a268f\") " pod="openshift-authentication/oauth-openshift-86d85988f6-rq46k" Oct 07 13:11:26 crc kubenswrapper[4677]: I1007 13:11:26.513508 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/f6742cb6-7552-442f-8a00-8b016b7a268f-v4-0-config-system-service-ca\") pod 
\"oauth-openshift-86d85988f6-rq46k\" (UID: \"f6742cb6-7552-442f-8a00-8b016b7a268f\") " pod="openshift-authentication/oauth-openshift-86d85988f6-rq46k" Oct 07 13:11:26 crc kubenswrapper[4677]: I1007 13:11:26.513542 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/f6742cb6-7552-442f-8a00-8b016b7a268f-v4-0-config-system-serving-cert\") pod \"oauth-openshift-86d85988f6-rq46k\" (UID: \"f6742cb6-7552-442f-8a00-8b016b7a268f\") " pod="openshift-authentication/oauth-openshift-86d85988f6-rq46k" Oct 07 13:11:26 crc kubenswrapper[4677]: I1007 13:11:26.513583 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/f6742cb6-7552-442f-8a00-8b016b7a268f-v4-0-config-system-cliconfig\") pod \"oauth-openshift-86d85988f6-rq46k\" (UID: \"f6742cb6-7552-442f-8a00-8b016b7a268f\") " pod="openshift-authentication/oauth-openshift-86d85988f6-rq46k" Oct 07 13:11:26 crc kubenswrapper[4677]: I1007 13:11:26.513646 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/f6742cb6-7552-442f-8a00-8b016b7a268f-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-86d85988f6-rq46k\" (UID: \"f6742cb6-7552-442f-8a00-8b016b7a268f\") " pod="openshift-authentication/oauth-openshift-86d85988f6-rq46k" Oct 07 13:11:26 crc kubenswrapper[4677]: I1007 13:11:26.513661 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/f6742cb6-7552-442f-8a00-8b016b7a268f-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-86d85988f6-rq46k\" (UID: \"f6742cb6-7552-442f-8a00-8b016b7a268f\") " pod="openshift-authentication/oauth-openshift-86d85988f6-rq46k" Oct 07 13:11:26 crc kubenswrapper[4677]: I1007 13:11:26.513703 4677 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bd1c8146-fe00-4a53-a102-17cfc6ef045b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 13:11:26 crc kubenswrapper[4677]: I1007 13:11:26.513714 4677 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/bd1c8146-fe00-4a53-a102-17cfc6ef045b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Oct 07 13:11:26 crc kubenswrapper[4677]: I1007 13:11:26.513723 4677 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/bd1c8146-fe00-4a53-a102-17cfc6ef045b-audit-policies\") on node \"crc\" DevicePath \"\"" Oct 07 13:11:26 crc kubenswrapper[4677]: I1007 13:11:26.514157 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bd1c8146-fe00-4a53-a102-17cfc6ef045b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "bd1c8146-fe00-4a53-a102-17cfc6ef045b" (UID: "bd1c8146-fe00-4a53-a102-17cfc6ef045b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:11:26 crc kubenswrapper[4677]: I1007 13:11:26.514242 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bd1c8146-fe00-4a53-a102-17cfc6ef045b-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "bd1c8146-fe00-4a53-a102-17cfc6ef045b" (UID: "bd1c8146-fe00-4a53-a102-17cfc6ef045b"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 13:11:26 crc kubenswrapper[4677]: I1007 13:11:26.517154 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd1c8146-fe00-4a53-a102-17cfc6ef045b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "bd1c8146-fe00-4a53-a102-17cfc6ef045b" (UID: "bd1c8146-fe00-4a53-a102-17cfc6ef045b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:11:26 crc kubenswrapper[4677]: I1007 13:11:26.518026 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd1c8146-fe00-4a53-a102-17cfc6ef045b-kube-api-access-d5xbw" (OuterVolumeSpecName: "kube-api-access-d5xbw") pod "bd1c8146-fe00-4a53-a102-17cfc6ef045b" (UID: "bd1c8146-fe00-4a53-a102-17cfc6ef045b"). InnerVolumeSpecName "kube-api-access-d5xbw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:11:26 crc kubenswrapper[4677]: I1007 13:11:26.523678 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd1c8146-fe00-4a53-a102-17cfc6ef045b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "bd1c8146-fe00-4a53-a102-17cfc6ef045b" (UID: "bd1c8146-fe00-4a53-a102-17cfc6ef045b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:11:26 crc kubenswrapper[4677]: I1007 13:11:26.524770 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd1c8146-fe00-4a53-a102-17cfc6ef045b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "bd1c8146-fe00-4a53-a102-17cfc6ef045b" (UID: "bd1c8146-fe00-4a53-a102-17cfc6ef045b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:11:26 crc kubenswrapper[4677]: I1007 13:11:26.525267 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd1c8146-fe00-4a53-a102-17cfc6ef045b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "bd1c8146-fe00-4a53-a102-17cfc6ef045b" (UID: "bd1c8146-fe00-4a53-a102-17cfc6ef045b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:11:26 crc kubenswrapper[4677]: I1007 13:11:26.525461 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd1c8146-fe00-4a53-a102-17cfc6ef045b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "bd1c8146-fe00-4a53-a102-17cfc6ef045b" (UID: "bd1c8146-fe00-4a53-a102-17cfc6ef045b"). InnerVolumeSpecName "v4-0-config-system-session". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:11:26 crc kubenswrapper[4677]: I1007 13:11:26.525605 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd1c8146-fe00-4a53-a102-17cfc6ef045b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "bd1c8146-fe00-4a53-a102-17cfc6ef045b" (UID: "bd1c8146-fe00-4a53-a102-17cfc6ef045b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:11:26 crc kubenswrapper[4677]: I1007 13:11:26.534800 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd1c8146-fe00-4a53-a102-17cfc6ef045b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "bd1c8146-fe00-4a53-a102-17cfc6ef045b" (UID: "bd1c8146-fe00-4a53-a102-17cfc6ef045b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:11:26 crc kubenswrapper[4677]: I1007 13:11:26.535137 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd1c8146-fe00-4a53-a102-17cfc6ef045b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "bd1c8146-fe00-4a53-a102-17cfc6ef045b" (UID: "bd1c8146-fe00-4a53-a102-17cfc6ef045b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:11:26 crc kubenswrapper[4677]: I1007 13:11:26.614581 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f6742cb6-7552-442f-8a00-8b016b7a268f-audit-policies\") pod \"oauth-openshift-86d85988f6-rq46k\" (UID: \"f6742cb6-7552-442f-8a00-8b016b7a268f\") " pod="openshift-authentication/oauth-openshift-86d85988f6-rq46k" Oct 07 13:11:26 crc kubenswrapper[4677]: I1007 13:11:26.614661 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/f6742cb6-7552-442f-8a00-8b016b7a268f-v4-0-config-system-service-ca\") pod \"oauth-openshift-86d85988f6-rq46k\" (UID: \"f6742cb6-7552-442f-8a00-8b016b7a268f\") " pod="openshift-authentication/oauth-openshift-86d85988f6-rq46k" Oct 07 13:11:26 crc kubenswrapper[4677]: I1007 13:11:26.614709 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/f6742cb6-7552-442f-8a00-8b016b7a268f-v4-0-config-system-serving-cert\") pod \"oauth-openshift-86d85988f6-rq46k\" (UID: \"f6742cb6-7552-442f-8a00-8b016b7a268f\") " pod="openshift-authentication/oauth-openshift-86d85988f6-rq46k" Oct 07 13:11:26 crc kubenswrapper[4677]: I1007 13:11:26.614753 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/f6742cb6-7552-442f-8a00-8b016b7a268f-v4-0-config-system-cliconfig\") pod \"oauth-openshift-86d85988f6-rq46k\" (UID: \"f6742cb6-7552-442f-8a00-8b016b7a268f\") " pod="openshift-authentication/oauth-openshift-86d85988f6-rq46k" Oct 07 13:11:26 crc kubenswrapper[4677]: I1007 13:11:26.614818 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: 
\"kubernetes.io/secret/f6742cb6-7552-442f-8a00-8b016b7a268f-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-86d85988f6-rq46k\" (UID: \"f6742cb6-7552-442f-8a00-8b016b7a268f\") " pod="openshift-authentication/oauth-openshift-86d85988f6-rq46k" Oct 07 13:11:26 crc kubenswrapper[4677]: I1007 13:11:26.614854 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/f6742cb6-7552-442f-8a00-8b016b7a268f-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-86d85988f6-rq46k\" (UID: \"f6742cb6-7552-442f-8a00-8b016b7a268f\") " pod="openshift-authentication/oauth-openshift-86d85988f6-rq46k" Oct 07 13:11:26 crc kubenswrapper[4677]: I1007 13:11:26.614894 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/f6742cb6-7552-442f-8a00-8b016b7a268f-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-86d85988f6-rq46k\" (UID: \"f6742cb6-7552-442f-8a00-8b016b7a268f\") " pod="openshift-authentication/oauth-openshift-86d85988f6-rq46k" Oct 07 13:11:26 crc kubenswrapper[4677]: I1007 13:11:26.614946 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/f6742cb6-7552-442f-8a00-8b016b7a268f-v4-0-config-system-session\") pod \"oauth-openshift-86d85988f6-rq46k\" (UID: \"f6742cb6-7552-442f-8a00-8b016b7a268f\") " pod="openshift-authentication/oauth-openshift-86d85988f6-rq46k" Oct 07 13:11:26 crc kubenswrapper[4677]: I1007 13:11:26.614978 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f6742cb6-7552-442f-8a00-8b016b7a268f-audit-dir\") pod \"oauth-openshift-86d85988f6-rq46k\" (UID: \"f6742cb6-7552-442f-8a00-8b016b7a268f\") " pod="openshift-authentication/oauth-openshift-86d85988f6-rq46k" Oct 07 13:11:26 crc kubenswrapper[4677]: I1007 13:11:26.615010 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f6742cb6-7552-442f-8a00-8b016b7a268f-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-86d85988f6-rq46k\" (UID: \"f6742cb6-7552-442f-8a00-8b016b7a268f\") " pod="openshift-authentication/oauth-openshift-86d85988f6-rq46k" Oct 07 13:11:26 crc kubenswrapper[4677]: I1007 13:11:26.615049 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/f6742cb6-7552-442f-8a00-8b016b7a268f-v4-0-config-system-router-certs\") pod \"oauth-openshift-86d85988f6-rq46k\" (UID: \"f6742cb6-7552-442f-8a00-8b016b7a268f\") " pod="openshift-authentication/oauth-openshift-86d85988f6-rq46k" Oct 07 13:11:26 crc kubenswrapper[4677]: I1007 13:11:26.615088 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/f6742cb6-7552-442f-8a00-8b016b7a268f-v4-0-config-user-template-error\") pod \"oauth-openshift-86d85988f6-rq46k\" (UID: \"f6742cb6-7552-442f-8a00-8b016b7a268f\") " pod="openshift-authentication/oauth-openshift-86d85988f6-rq46k" Oct 07 13:11:26 crc kubenswrapper[4677]: I1007 13:11:26.615125 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: 
\"kubernetes.io/secret/f6742cb6-7552-442f-8a00-8b016b7a268f-v4-0-config-user-template-login\") pod \"oauth-openshift-86d85988f6-rq46k\" (UID: \"f6742cb6-7552-442f-8a00-8b016b7a268f\") " pod="openshift-authentication/oauth-openshift-86d85988f6-rq46k" Oct 07 13:11:26 crc kubenswrapper[4677]: I1007 13:11:26.615166 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5nkhk\" (UniqueName: \"kubernetes.io/projected/f6742cb6-7552-442f-8a00-8b016b7a268f-kube-api-access-5nkhk\") pod \"oauth-openshift-86d85988f6-rq46k\" (UID: \"f6742cb6-7552-442f-8a00-8b016b7a268f\") " pod="openshift-authentication/oauth-openshift-86d85988f6-rq46k" Oct 07 13:11:26 crc kubenswrapper[4677]: I1007 13:11:26.615226 4677 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/bd1c8146-fe00-4a53-a102-17cfc6ef045b-audit-dir\") on node \"crc\" DevicePath \"\"" Oct 07 13:11:26 crc kubenswrapper[4677]: I1007 13:11:26.615247 4677 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/bd1c8146-fe00-4a53-a102-17cfc6ef045b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Oct 07 13:11:26 crc kubenswrapper[4677]: I1007 13:11:26.615268 4677 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/bd1c8146-fe00-4a53-a102-17cfc6ef045b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 07 13:11:26 crc kubenswrapper[4677]: I1007 13:11:26.615286 4677 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/bd1c8146-fe00-4a53-a102-17cfc6ef045b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Oct 07 13:11:26 crc kubenswrapper[4677]: I1007 13:11:26.615305 4677 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/bd1c8146-fe00-4a53-a102-17cfc6ef045b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Oct 07 13:11:26 crc kubenswrapper[4677]: I1007 13:11:26.615374 4677 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/bd1c8146-fe00-4a53-a102-17cfc6ef045b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Oct 07 13:11:26 crc kubenswrapper[4677]: I1007 13:11:26.615398 4677 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/bd1c8146-fe00-4a53-a102-17cfc6ef045b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Oct 07 13:11:26 crc kubenswrapper[4677]: I1007 13:11:26.615419 4677 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/bd1c8146-fe00-4a53-a102-17cfc6ef045b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Oct 07 13:11:26 crc kubenswrapper[4677]: I1007 13:11:26.615465 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/f6742cb6-7552-442f-8a00-8b016b7a268f-v4-0-config-system-service-ca\") pod \"oauth-openshift-86d85988f6-rq46k\" (UID: \"f6742cb6-7552-442f-8a00-8b016b7a268f\") " pod="openshift-authentication/oauth-openshift-86d85988f6-rq46k" Oct 07 13:11:26 crc kubenswrapper[4677]: I1007 13:11:26.615480 
4677 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/bd1c8146-fe00-4a53-a102-17cfc6ef045b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Oct 07 13:11:26 crc kubenswrapper[4677]: I1007 13:11:26.615501 4677 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d5xbw\" (UniqueName: \"kubernetes.io/projected/bd1c8146-fe00-4a53-a102-17cfc6ef045b-kube-api-access-d5xbw\") on node \"crc\" DevicePath \"\"" Oct 07 13:11:26 crc kubenswrapper[4677]: I1007 13:11:26.615520 4677 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/bd1c8146-fe00-4a53-a102-17cfc6ef045b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Oct 07 13:11:26 crc kubenswrapper[4677]: I1007 13:11:26.615602 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f6742cb6-7552-442f-8a00-8b016b7a268f-audit-dir\") pod \"oauth-openshift-86d85988f6-rq46k\" (UID: \"f6742cb6-7552-442f-8a00-8b016b7a268f\") " pod="openshift-authentication/oauth-openshift-86d85988f6-rq46k" Oct 07 13:11:26 crc kubenswrapper[4677]: I1007 13:11:26.616284 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f6742cb6-7552-442f-8a00-8b016b7a268f-audit-policies\") pod \"oauth-openshift-86d85988f6-rq46k\" (UID: \"f6742cb6-7552-442f-8a00-8b016b7a268f\") " pod="openshift-authentication/oauth-openshift-86d85988f6-rq46k" Oct 07 13:11:26 crc kubenswrapper[4677]: I1007 13:11:26.616972 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/f6742cb6-7552-442f-8a00-8b016b7a268f-v4-0-config-system-cliconfig\") pod \"oauth-openshift-86d85988f6-rq46k\" (UID: \"f6742cb6-7552-442f-8a00-8b016b7a268f\") " pod="openshift-authentication/oauth-openshift-86d85988f6-rq46k" Oct 07 13:11:26 crc kubenswrapper[4677]: I1007 13:11:26.616974 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f6742cb6-7552-442f-8a00-8b016b7a268f-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-86d85988f6-rq46k\" (UID: \"f6742cb6-7552-442f-8a00-8b016b7a268f\") " pod="openshift-authentication/oauth-openshift-86d85988f6-rq46k" Oct 07 13:11:26 crc kubenswrapper[4677]: I1007 13:11:26.619366 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/f6742cb6-7552-442f-8a00-8b016b7a268f-v4-0-config-system-serving-cert\") pod \"oauth-openshift-86d85988f6-rq46k\" (UID: \"f6742cb6-7552-442f-8a00-8b016b7a268f\") " pod="openshift-authentication/oauth-openshift-86d85988f6-rq46k" Oct 07 13:11:26 crc kubenswrapper[4677]: I1007 13:11:26.620174 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/f6742cb6-7552-442f-8a00-8b016b7a268f-v4-0-config-user-template-error\") pod \"oauth-openshift-86d85988f6-rq46k\" (UID: \"f6742cb6-7552-442f-8a00-8b016b7a268f\") " pod="openshift-authentication/oauth-openshift-86d85988f6-rq46k" Oct 07 13:11:26 crc kubenswrapper[4677]: I1007 13:11:26.620791 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: 
\"kubernetes.io/secret/f6742cb6-7552-442f-8a00-8b016b7a268f-v4-0-config-system-router-certs\") pod \"oauth-openshift-86d85988f6-rq46k\" (UID: \"f6742cb6-7552-442f-8a00-8b016b7a268f\") " pod="openshift-authentication/oauth-openshift-86d85988f6-rq46k" Oct 07 13:11:26 crc kubenswrapper[4677]: I1007 13:11:26.629474 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/f6742cb6-7552-442f-8a00-8b016b7a268f-v4-0-config-system-session\") pod \"oauth-openshift-86d85988f6-rq46k\" (UID: \"f6742cb6-7552-442f-8a00-8b016b7a268f\") " pod="openshift-authentication/oauth-openshift-86d85988f6-rq46k" Oct 07 13:11:26 crc kubenswrapper[4677]: I1007 13:11:26.629540 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/f6742cb6-7552-442f-8a00-8b016b7a268f-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-86d85988f6-rq46k\" (UID: \"f6742cb6-7552-442f-8a00-8b016b7a268f\") " pod="openshift-authentication/oauth-openshift-86d85988f6-rq46k" Oct 07 13:11:26 crc kubenswrapper[4677]: I1007 13:11:26.629962 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/f6742cb6-7552-442f-8a00-8b016b7a268f-v4-0-config-user-template-login\") pod \"oauth-openshift-86d85988f6-rq46k\" (UID: \"f6742cb6-7552-442f-8a00-8b016b7a268f\") " pod="openshift-authentication/oauth-openshift-86d85988f6-rq46k" Oct 07 13:11:26 crc kubenswrapper[4677]: I1007 13:11:26.629973 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/f6742cb6-7552-442f-8a00-8b016b7a268f-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-86d85988f6-rq46k\" (UID: \"f6742cb6-7552-442f-8a00-8b016b7a268f\") " pod="openshift-authentication/oauth-openshift-86d85988f6-rq46k" Oct 07 13:11:26 crc kubenswrapper[4677]: I1007 13:11:26.630399 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/f6742cb6-7552-442f-8a00-8b016b7a268f-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-86d85988f6-rq46k\" (UID: \"f6742cb6-7552-442f-8a00-8b016b7a268f\") " pod="openshift-authentication/oauth-openshift-86d85988f6-rq46k" Oct 07 13:11:26 crc kubenswrapper[4677]: I1007 13:11:26.633310 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5nkhk\" (UniqueName: \"kubernetes.io/projected/f6742cb6-7552-442f-8a00-8b016b7a268f-kube-api-access-5nkhk\") pod \"oauth-openshift-86d85988f6-rq46k\" (UID: \"f6742cb6-7552-442f-8a00-8b016b7a268f\") " pod="openshift-authentication/oauth-openshift-86d85988f6-rq46k" Oct 07 13:11:26 crc kubenswrapper[4677]: I1007 13:11:26.789746 4677 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-86d85988f6-rq46k" Oct 07 13:11:27 crc kubenswrapper[4677]: I1007 13:11:27.062627 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-86d85988f6-rq46k"] Oct 07 13:11:27 crc kubenswrapper[4677]: I1007 13:11:27.422586 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-86d85988f6-rq46k" event={"ID":"f6742cb6-7552-442f-8a00-8b016b7a268f","Type":"ContainerStarted","Data":"aa78ab4f05f99d6afda2d2a7be2e52f793c26aec0b0c89498a1e3f580f4b4781"} Oct 07 13:11:27 crc kubenswrapper[4677]: I1007 13:11:27.422999 4677 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-86d85988f6-rq46k" Oct 07 13:11:27 crc kubenswrapper[4677]: I1007 13:11:27.423025 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-86d85988f6-rq46k" event={"ID":"f6742cb6-7552-442f-8a00-8b016b7a268f","Type":"ContainerStarted","Data":"9f517198244e74ef952570d7d2b44ef7325d2e01e84c691c044b6c214e208f76"} Oct 07 13:11:27 crc kubenswrapper[4677]: I1007 13:11:27.422645 4677 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-gspst" Oct 07 13:11:27 crc kubenswrapper[4677]: I1007 13:11:27.447073 4677 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-86d85988f6-rq46k" podStartSLOduration=27.447050392 podStartE2EDuration="27.447050392s" podCreationTimestamp="2025-10-07 13:11:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:11:27.443221581 +0000 UTC m=+258.928930706" watchObservedRunningTime="2025-10-07 13:11:27.447050392 +0000 UTC m=+258.932759507" Oct 07 13:11:27 crc kubenswrapper[4677]: I1007 13:11:27.459018 4677 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-gspst"] Oct 07 13:11:27 crc kubenswrapper[4677]: I1007 13:11:27.468202 4677 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-gspst"] Oct 07 13:11:27 crc kubenswrapper[4677]: I1007 13:11:27.801547 4677 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-86d85988f6-rq46k" Oct 07 13:11:29 crc kubenswrapper[4677]: I1007 13:11:29.314848 4677 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd1c8146-fe00-4a53-a102-17cfc6ef045b" path="/var/lib/kubelet/pods/bd1c8146-fe00-4a53-a102-17cfc6ef045b/volumes" Oct 07 13:11:48 crc kubenswrapper[4677]: I1007 13:11:48.421283 4677 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-27x4s"] Oct 07 13:11:48 crc kubenswrapper[4677]: I1007 13:11:48.423673 4677 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-27x4s" podUID="57d9fc75-7df6-4205-9600-0e0d0ff04f8a" containerName="registry-server" containerID="cri-o://07b90718fa1794fc4a0ca8352d664a3f198762e762f6dc658ab972880a1c62cb" gracePeriod=30 Oct 07 13:11:48 crc kubenswrapper[4677]: I1007 13:11:48.442092 4677 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hlxmv"] Oct 07 13:11:48 crc kubenswrapper[4677]: I1007 13:11:48.442366 4677 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-hlxmv" podUID="c9473eea-460d-4148-8b4f-f2e0ccba3b2e" containerName="registry-server" containerID="cri-o://443eb657bd36b30993b21e49cb7afe7da1f3749016a99007eff3883295cfc1c8" gracePeriod=30 Oct 07 13:11:48 crc kubenswrapper[4677]: I1007 13:11:48.449481 4677 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-r6nq5"] Oct 07 13:11:48 crc kubenswrapper[4677]: I1007 13:11:48.450184 4677 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-r6nq5" podUID="9122c8d7-acc8-4ed0-81b0-79ea36536943" containerName="marketplace-operator" containerID="cri-o://d4f926410d977918318838fdb2e35eaa227cf01c40979973354c564ae71855d9" gracePeriod=30 Oct 07 13:11:48 crc kubenswrapper[4677]: I1007 13:11:48.454410 4677 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-frwdt"] Oct 07 13:11:48 crc kubenswrapper[4677]: I1007 13:11:48.454931 4677 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-frwdt" podUID="8932abb2-d07f-45df-bd32-0ac930df1346" containerName="registry-server" containerID="cri-o://c6063234bdae4039c7132918b951101182c614a1997ec7942328e7498b689ef9" gracePeriod=30 Oct 07 13:11:48 crc kubenswrapper[4677]: I1007 13:11:48.459474 4677 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5cjf7"] Oct 07 13:11:48 crc kubenswrapper[4677]: I1007 13:11:48.459789 4677 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-5cjf7" podUID="1ecf8a0f-0b5b-42c2-80c5-cb0a82421387" containerName="registry-server" containerID="cri-o://87dcb21eaf9c5439f9d4a6390584bfcb0471d21b0b2cf7184e4b6d78b5b23c56" gracePeriod=30 Oct 07 13:11:48 crc kubenswrapper[4677]: I1007 13:11:48.463979 4677 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-zbppr"] Oct 07 13:11:48 crc kubenswrapper[4677]: I1007 13:11:48.464827 4677 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-zbppr" Oct 07 13:11:48 crc kubenswrapper[4677]: I1007 13:11:48.515839 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-zbppr"] Oct 07 13:11:48 crc kubenswrapper[4677]: I1007 13:11:48.567620 4677 generic.go:334] "Generic (PLEG): container finished" podID="57d9fc75-7df6-4205-9600-0e0d0ff04f8a" containerID="07b90718fa1794fc4a0ca8352d664a3f198762e762f6dc658ab972880a1c62cb" exitCode=0 Oct 07 13:11:48 crc kubenswrapper[4677]: I1007 13:11:48.567658 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-27x4s" event={"ID":"57d9fc75-7df6-4205-9600-0e0d0ff04f8a","Type":"ContainerDied","Data":"07b90718fa1794fc4a0ca8352d664a3f198762e762f6dc658ab972880a1c62cb"} Oct 07 13:11:48 crc kubenswrapper[4677]: I1007 13:11:48.632347 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7ptzw\" (UniqueName: \"kubernetes.io/projected/c8872e7f-608d-4ade-8466-e1e743417ece-kube-api-access-7ptzw\") pod \"marketplace-operator-79b997595-zbppr\" (UID: \"c8872e7f-608d-4ade-8466-e1e743417ece\") " pod="openshift-marketplace/marketplace-operator-79b997595-zbppr" Oct 07 13:11:48 crc kubenswrapper[4677]: I1007 13:11:48.632410 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c8872e7f-608d-4ade-8466-e1e743417ece-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-zbppr\" (UID: \"c8872e7f-608d-4ade-8466-e1e743417ece\") " pod="openshift-marketplace/marketplace-operator-79b997595-zbppr" Oct 07 13:11:48 crc kubenswrapper[4677]: I1007 13:11:48.632456 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/c8872e7f-608d-4ade-8466-e1e743417ece-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-zbppr\" (UID: \"c8872e7f-608d-4ade-8466-e1e743417ece\") " pod="openshift-marketplace/marketplace-operator-79b997595-zbppr" Oct 07 13:11:48 crc kubenswrapper[4677]: I1007 13:11:48.733971 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7ptzw\" (UniqueName: \"kubernetes.io/projected/c8872e7f-608d-4ade-8466-e1e743417ece-kube-api-access-7ptzw\") pod \"marketplace-operator-79b997595-zbppr\" (UID: \"c8872e7f-608d-4ade-8466-e1e743417ece\") " pod="openshift-marketplace/marketplace-operator-79b997595-zbppr" Oct 07 13:11:48 crc kubenswrapper[4677]: I1007 13:11:48.734029 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c8872e7f-608d-4ade-8466-e1e743417ece-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-zbppr\" (UID: \"c8872e7f-608d-4ade-8466-e1e743417ece\") " pod="openshift-marketplace/marketplace-operator-79b997595-zbppr" Oct 07 13:11:48 crc kubenswrapper[4677]: I1007 13:11:48.734064 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/c8872e7f-608d-4ade-8466-e1e743417ece-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-zbppr\" (UID: \"c8872e7f-608d-4ade-8466-e1e743417ece\") " pod="openshift-marketplace/marketplace-operator-79b997595-zbppr" Oct 07 13:11:48 crc 
kubenswrapper[4677]: I1007 13:11:48.735893 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c8872e7f-608d-4ade-8466-e1e743417ece-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-zbppr\" (UID: \"c8872e7f-608d-4ade-8466-e1e743417ece\") " pod="openshift-marketplace/marketplace-operator-79b997595-zbppr" Oct 07 13:11:48 crc kubenswrapper[4677]: I1007 13:11:48.739192 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/c8872e7f-608d-4ade-8466-e1e743417ece-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-zbppr\" (UID: \"c8872e7f-608d-4ade-8466-e1e743417ece\") " pod="openshift-marketplace/marketplace-operator-79b997595-zbppr" Oct 07 13:11:48 crc kubenswrapper[4677]: I1007 13:11:48.751784 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7ptzw\" (UniqueName: \"kubernetes.io/projected/c8872e7f-608d-4ade-8466-e1e743417ece-kube-api-access-7ptzw\") pod \"marketplace-operator-79b997595-zbppr\" (UID: \"c8872e7f-608d-4ade-8466-e1e743417ece\") " pod="openshift-marketplace/marketplace-operator-79b997595-zbppr" Oct 07 13:11:48 crc kubenswrapper[4677]: I1007 13:11:48.781692 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-zbppr" Oct 07 13:11:48 crc kubenswrapper[4677]: I1007 13:11:48.877193 4677 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hlxmv" Oct 07 13:11:48 crc kubenswrapper[4677]: I1007 13:11:48.956853 4677 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-r6nq5" Oct 07 13:11:48 crc kubenswrapper[4677]: I1007 13:11:48.984467 4677 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5cjf7" Oct 07 13:11:48 crc kubenswrapper[4677]: I1007 13:11:48.992666 4677 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-27x4s" Oct 07 13:11:49 crc kubenswrapper[4677]: I1007 13:11:49.008917 4677 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-frwdt" Oct 07 13:11:49 crc kubenswrapper[4677]: I1007 13:11:49.037667 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9473eea-460d-4148-8b4f-f2e0ccba3b2e-catalog-content\") pod \"c9473eea-460d-4148-8b4f-f2e0ccba3b2e\" (UID: \"c9473eea-460d-4148-8b4f-f2e0ccba3b2e\") " Oct 07 13:11:49 crc kubenswrapper[4677]: I1007 13:11:49.037773 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9473eea-460d-4148-8b4f-f2e0ccba3b2e-utilities\") pod \"c9473eea-460d-4148-8b4f-f2e0ccba3b2e\" (UID: \"c9473eea-460d-4148-8b4f-f2e0ccba3b2e\") " Oct 07 13:11:49 crc kubenswrapper[4677]: I1007 13:11:49.038498 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xs5rg\" (UniqueName: \"kubernetes.io/projected/c9473eea-460d-4148-8b4f-f2e0ccba3b2e-kube-api-access-xs5rg\") pod \"c9473eea-460d-4148-8b4f-f2e0ccba3b2e\" (UID: \"c9473eea-460d-4148-8b4f-f2e0ccba3b2e\") " Oct 07 13:11:49 crc kubenswrapper[4677]: I1007 13:11:49.040046 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c9473eea-460d-4148-8b4f-f2e0ccba3b2e-utilities" (OuterVolumeSpecName: "utilities") pod "c9473eea-460d-4148-8b4f-f2e0ccba3b2e" (UID: "c9473eea-460d-4148-8b4f-f2e0ccba3b2e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:11:49 crc kubenswrapper[4677]: I1007 13:11:49.042206 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9473eea-460d-4148-8b4f-f2e0ccba3b2e-kube-api-access-xs5rg" (OuterVolumeSpecName: "kube-api-access-xs5rg") pod "c9473eea-460d-4148-8b4f-f2e0ccba3b2e" (UID: "c9473eea-460d-4148-8b4f-f2e0ccba3b2e"). InnerVolumeSpecName "kube-api-access-xs5rg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:11:49 crc kubenswrapper[4677]: I1007 13:11:49.077763 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-zbppr"] Oct 07 13:11:49 crc kubenswrapper[4677]: W1007 13:11:49.083970 4677 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc8872e7f_608d_4ade_8466_e1e743417ece.slice/crio-6ba6ae0c4b34c638dc206b563c36ce32805b76744dc793cec709b3c4bd948dfa WatchSource:0}: Error finding container 6ba6ae0c4b34c638dc206b563c36ce32805b76744dc793cec709b3c4bd948dfa: Status 404 returned error can't find the container with id 6ba6ae0c4b34c638dc206b563c36ce32805b76744dc793cec709b3c4bd948dfa Oct 07 13:11:49 crc kubenswrapper[4677]: I1007 13:11:49.117334 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c9473eea-460d-4148-8b4f-f2e0ccba3b2e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c9473eea-460d-4148-8b4f-f2e0ccba3b2e" (UID: "c9473eea-460d-4148-8b4f-f2e0ccba3b2e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:11:49 crc kubenswrapper[4677]: I1007 13:11:49.139411 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6wn5b\" (UniqueName: \"kubernetes.io/projected/8932abb2-d07f-45df-bd32-0ac930df1346-kube-api-access-6wn5b\") pod \"8932abb2-d07f-45df-bd32-0ac930df1346\" (UID: \"8932abb2-d07f-45df-bd32-0ac930df1346\") " Oct 07 13:11:49 crc kubenswrapper[4677]: I1007 13:11:49.139486 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1ecf8a0f-0b5b-42c2-80c5-cb0a82421387-catalog-content\") pod \"1ecf8a0f-0b5b-42c2-80c5-cb0a82421387\" (UID: \"1ecf8a0f-0b5b-42c2-80c5-cb0a82421387\") " Oct 07 13:11:49 crc kubenswrapper[4677]: I1007 13:11:49.139523 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8932abb2-d07f-45df-bd32-0ac930df1346-catalog-content\") pod \"8932abb2-d07f-45df-bd32-0ac930df1346\" (UID: \"8932abb2-d07f-45df-bd32-0ac930df1346\") " Oct 07 13:11:49 crc kubenswrapper[4677]: I1007 13:11:49.139558 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8932abb2-d07f-45df-bd32-0ac930df1346-utilities\") pod \"8932abb2-d07f-45df-bd32-0ac930df1346\" (UID: \"8932abb2-d07f-45df-bd32-0ac930df1346\") " Oct 07 13:11:49 crc kubenswrapper[4677]: I1007 13:11:49.139580 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57d9fc75-7df6-4205-9600-0e0d0ff04f8a-utilities\") pod \"57d9fc75-7df6-4205-9600-0e0d0ff04f8a\" (UID: \"57d9fc75-7df6-4205-9600-0e0d0ff04f8a\") " Oct 07 13:11:49 crc kubenswrapper[4677]: I1007 13:11:49.139599 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h5qxl\" (UniqueName: \"kubernetes.io/projected/57d9fc75-7df6-4205-9600-0e0d0ff04f8a-kube-api-access-h5qxl\") pod \"57d9fc75-7df6-4205-9600-0e0d0ff04f8a\" (UID: \"57d9fc75-7df6-4205-9600-0e0d0ff04f8a\") " Oct 07 13:11:49 crc kubenswrapper[4677]: I1007 13:11:49.139618 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57d9fc75-7df6-4205-9600-0e0d0ff04f8a-catalog-content\") pod \"57d9fc75-7df6-4205-9600-0e0d0ff04f8a\" (UID: \"57d9fc75-7df6-4205-9600-0e0d0ff04f8a\") " Oct 07 13:11:49 crc kubenswrapper[4677]: I1007 13:11:49.139639 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/9122c8d7-acc8-4ed0-81b0-79ea36536943-marketplace-operator-metrics\") pod \"9122c8d7-acc8-4ed0-81b0-79ea36536943\" (UID: \"9122c8d7-acc8-4ed0-81b0-79ea36536943\") " Oct 07 13:11:49 crc kubenswrapper[4677]: I1007 13:11:49.139668 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xn6pm\" (UniqueName: \"kubernetes.io/projected/9122c8d7-acc8-4ed0-81b0-79ea36536943-kube-api-access-xn6pm\") pod \"9122c8d7-acc8-4ed0-81b0-79ea36536943\" (UID: \"9122c8d7-acc8-4ed0-81b0-79ea36536943\") " Oct 07 13:11:49 crc kubenswrapper[4677]: I1007 13:11:49.139701 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/9122c8d7-acc8-4ed0-81b0-79ea36536943-marketplace-trusted-ca\") pod \"9122c8d7-acc8-4ed0-81b0-79ea36536943\" (UID: \"9122c8d7-acc8-4ed0-81b0-79ea36536943\") " Oct 07 13:11:49 crc kubenswrapper[4677]: I1007 13:11:49.139721 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lwt44\" (UniqueName: \"kubernetes.io/projected/1ecf8a0f-0b5b-42c2-80c5-cb0a82421387-kube-api-access-lwt44\") pod \"1ecf8a0f-0b5b-42c2-80c5-cb0a82421387\" (UID: \"1ecf8a0f-0b5b-42c2-80c5-cb0a82421387\") " Oct 07 13:11:49 crc kubenswrapper[4677]: I1007 13:11:49.139744 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1ecf8a0f-0b5b-42c2-80c5-cb0a82421387-utilities\") pod \"1ecf8a0f-0b5b-42c2-80c5-cb0a82421387\" (UID: \"1ecf8a0f-0b5b-42c2-80c5-cb0a82421387\") " Oct 07 13:11:49 crc kubenswrapper[4677]: I1007 13:11:49.139961 4677 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xs5rg\" (UniqueName: \"kubernetes.io/projected/c9473eea-460d-4148-8b4f-f2e0ccba3b2e-kube-api-access-xs5rg\") on node \"crc\" DevicePath \"\"" Oct 07 13:11:49 crc kubenswrapper[4677]: I1007 13:11:49.139980 4677 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9473eea-460d-4148-8b4f-f2e0ccba3b2e-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 13:11:49 crc kubenswrapper[4677]: I1007 13:11:49.139992 4677 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9473eea-460d-4148-8b4f-f2e0ccba3b2e-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 13:11:49 crc kubenswrapper[4677]: I1007 13:11:49.140700 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1ecf8a0f-0b5b-42c2-80c5-cb0a82421387-utilities" (OuterVolumeSpecName: "utilities") pod "1ecf8a0f-0b5b-42c2-80c5-cb0a82421387" (UID: "1ecf8a0f-0b5b-42c2-80c5-cb0a82421387"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:11:49 crc kubenswrapper[4677]: I1007 13:11:49.141267 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9122c8d7-acc8-4ed0-81b0-79ea36536943-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "9122c8d7-acc8-4ed0-81b0-79ea36536943" (UID: "9122c8d7-acc8-4ed0-81b0-79ea36536943"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:11:49 crc kubenswrapper[4677]: I1007 13:11:49.141986 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8932abb2-d07f-45df-bd32-0ac930df1346-utilities" (OuterVolumeSpecName: "utilities") pod "8932abb2-d07f-45df-bd32-0ac930df1346" (UID: "8932abb2-d07f-45df-bd32-0ac930df1346"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:11:49 crc kubenswrapper[4677]: I1007 13:11:49.142116 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57d9fc75-7df6-4205-9600-0e0d0ff04f8a-utilities" (OuterVolumeSpecName: "utilities") pod "57d9fc75-7df6-4205-9600-0e0d0ff04f8a" (UID: "57d9fc75-7df6-4205-9600-0e0d0ff04f8a"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:11:49 crc kubenswrapper[4677]: I1007 13:11:49.145191 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8932abb2-d07f-45df-bd32-0ac930df1346-kube-api-access-6wn5b" (OuterVolumeSpecName: "kube-api-access-6wn5b") pod "8932abb2-d07f-45df-bd32-0ac930df1346" (UID: "8932abb2-d07f-45df-bd32-0ac930df1346"). InnerVolumeSpecName "kube-api-access-6wn5b". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:11:49 crc kubenswrapper[4677]: I1007 13:11:49.145259 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57d9fc75-7df6-4205-9600-0e0d0ff04f8a-kube-api-access-h5qxl" (OuterVolumeSpecName: "kube-api-access-h5qxl") pod "57d9fc75-7df6-4205-9600-0e0d0ff04f8a" (UID: "57d9fc75-7df6-4205-9600-0e0d0ff04f8a"). InnerVolumeSpecName "kube-api-access-h5qxl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:11:49 crc kubenswrapper[4677]: I1007 13:11:49.156575 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ecf8a0f-0b5b-42c2-80c5-cb0a82421387-kube-api-access-lwt44" (OuterVolumeSpecName: "kube-api-access-lwt44") pod "1ecf8a0f-0b5b-42c2-80c5-cb0a82421387" (UID: "1ecf8a0f-0b5b-42c2-80c5-cb0a82421387"). InnerVolumeSpecName "kube-api-access-lwt44". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:11:49 crc kubenswrapper[4677]: I1007 13:11:49.156778 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9122c8d7-acc8-4ed0-81b0-79ea36536943-kube-api-access-xn6pm" (OuterVolumeSpecName: "kube-api-access-xn6pm") pod "9122c8d7-acc8-4ed0-81b0-79ea36536943" (UID: "9122c8d7-acc8-4ed0-81b0-79ea36536943"). InnerVolumeSpecName "kube-api-access-xn6pm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:11:49 crc kubenswrapper[4677]: I1007 13:11:49.157897 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8932abb2-d07f-45df-bd32-0ac930df1346-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8932abb2-d07f-45df-bd32-0ac930df1346" (UID: "8932abb2-d07f-45df-bd32-0ac930df1346"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:11:49 crc kubenswrapper[4677]: I1007 13:11:49.158691 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9122c8d7-acc8-4ed0-81b0-79ea36536943-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "9122c8d7-acc8-4ed0-81b0-79ea36536943" (UID: "9122c8d7-acc8-4ed0-81b0-79ea36536943"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:11:49 crc kubenswrapper[4677]: I1007 13:11:49.190650 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57d9fc75-7df6-4205-9600-0e0d0ff04f8a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57d9fc75-7df6-4205-9600-0e0d0ff04f8a" (UID: "57d9fc75-7df6-4205-9600-0e0d0ff04f8a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:11:49 crc kubenswrapper[4677]: I1007 13:11:49.228294 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1ecf8a0f-0b5b-42c2-80c5-cb0a82421387-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1ecf8a0f-0b5b-42c2-80c5-cb0a82421387" (UID: "1ecf8a0f-0b5b-42c2-80c5-cb0a82421387"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:11:49 crc kubenswrapper[4677]: I1007 13:11:49.241347 4677 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6wn5b\" (UniqueName: \"kubernetes.io/projected/8932abb2-d07f-45df-bd32-0ac930df1346-kube-api-access-6wn5b\") on node \"crc\" DevicePath \"\"" Oct 07 13:11:49 crc kubenswrapper[4677]: I1007 13:11:49.241386 4677 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1ecf8a0f-0b5b-42c2-80c5-cb0a82421387-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 13:11:49 crc kubenswrapper[4677]: I1007 13:11:49.241396 4677 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8932abb2-d07f-45df-bd32-0ac930df1346-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 13:11:49 crc kubenswrapper[4677]: I1007 13:11:49.241405 4677 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8932abb2-d07f-45df-bd32-0ac930df1346-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 13:11:49 crc kubenswrapper[4677]: I1007 13:11:49.241415 4677 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57d9fc75-7df6-4205-9600-0e0d0ff04f8a-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 13:11:49 crc kubenswrapper[4677]: I1007 13:11:49.241423 4677 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h5qxl\" (UniqueName: \"kubernetes.io/projected/57d9fc75-7df6-4205-9600-0e0d0ff04f8a-kube-api-access-h5qxl\") on node \"crc\" DevicePath \"\"" Oct 07 13:11:49 crc kubenswrapper[4677]: I1007 13:11:49.241471 4677 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57d9fc75-7df6-4205-9600-0e0d0ff04f8a-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 13:11:49 crc kubenswrapper[4677]: I1007 13:11:49.241482 4677 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/9122c8d7-acc8-4ed0-81b0-79ea36536943-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Oct 07 13:11:49 crc kubenswrapper[4677]: I1007 13:11:49.241491 4677 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xn6pm\" (UniqueName: \"kubernetes.io/projected/9122c8d7-acc8-4ed0-81b0-79ea36536943-kube-api-access-xn6pm\") on node \"crc\" DevicePath \"\"" Oct 07 13:11:49 crc kubenswrapper[4677]: I1007 13:11:49.241499 4677 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9122c8d7-acc8-4ed0-81b0-79ea36536943-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 07 13:11:49 crc kubenswrapper[4677]: I1007 13:11:49.241507 4677 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lwt44\" (UniqueName: \"kubernetes.io/projected/1ecf8a0f-0b5b-42c2-80c5-cb0a82421387-kube-api-access-lwt44\") on node \"crc\" DevicePath \"\"" Oct 07 13:11:49 crc 
kubenswrapper[4677]: I1007 13:11:49.241514 4677 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1ecf8a0f-0b5b-42c2-80c5-cb0a82421387-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 13:11:49 crc kubenswrapper[4677]: I1007 13:11:49.572975 4677 generic.go:334] "Generic (PLEG): container finished" podID="9122c8d7-acc8-4ed0-81b0-79ea36536943" containerID="d4f926410d977918318838fdb2e35eaa227cf01c40979973354c564ae71855d9" exitCode=0 Oct 07 13:11:49 crc kubenswrapper[4677]: I1007 13:11:49.573054 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-r6nq5" event={"ID":"9122c8d7-acc8-4ed0-81b0-79ea36536943","Type":"ContainerDied","Data":"d4f926410d977918318838fdb2e35eaa227cf01c40979973354c564ae71855d9"} Oct 07 13:11:49 crc kubenswrapper[4677]: I1007 13:11:49.573096 4677 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-r6nq5" Oct 07 13:11:49 crc kubenswrapper[4677]: I1007 13:11:49.573116 4677 scope.go:117] "RemoveContainer" containerID="d4f926410d977918318838fdb2e35eaa227cf01c40979973354c564ae71855d9" Oct 07 13:11:49 crc kubenswrapper[4677]: I1007 13:11:49.573101 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-r6nq5" event={"ID":"9122c8d7-acc8-4ed0-81b0-79ea36536943","Type":"ContainerDied","Data":"97106e1decb2379e7ff949abee25dcf9c3c6f5c69a622dbade3fad042fa25533"} Oct 07 13:11:49 crc kubenswrapper[4677]: I1007 13:11:49.575814 4677 generic.go:334] "Generic (PLEG): container finished" podID="8932abb2-d07f-45df-bd32-0ac930df1346" containerID="c6063234bdae4039c7132918b951101182c614a1997ec7942328e7498b689ef9" exitCode=0 Oct 07 13:11:49 crc kubenswrapper[4677]: I1007 13:11:49.575916 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-frwdt" event={"ID":"8932abb2-d07f-45df-bd32-0ac930df1346","Type":"ContainerDied","Data":"c6063234bdae4039c7132918b951101182c614a1997ec7942328e7498b689ef9"} Oct 07 13:11:49 crc kubenswrapper[4677]: I1007 13:11:49.575948 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-frwdt" event={"ID":"8932abb2-d07f-45df-bd32-0ac930df1346","Type":"ContainerDied","Data":"b801dee5110267ee9a212216728b1d282f1b0f29ef826ca977476e21ad2a2d22"} Oct 07 13:11:49 crc kubenswrapper[4677]: I1007 13:11:49.576026 4677 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-frwdt" Oct 07 13:11:49 crc kubenswrapper[4677]: I1007 13:11:49.581541 4677 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-27x4s" Oct 07 13:11:49 crc kubenswrapper[4677]: I1007 13:11:49.581586 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-27x4s" event={"ID":"57d9fc75-7df6-4205-9600-0e0d0ff04f8a","Type":"ContainerDied","Data":"56ae3a9e16ed281aa2dc4653fe272ead892537929b5a8125d10ea02e5066638d"} Oct 07 13:11:49 crc kubenswrapper[4677]: I1007 13:11:49.585154 4677 generic.go:334] "Generic (PLEG): container finished" podID="1ecf8a0f-0b5b-42c2-80c5-cb0a82421387" containerID="87dcb21eaf9c5439f9d4a6390584bfcb0471d21b0b2cf7184e4b6d78b5b23c56" exitCode=0 Oct 07 13:11:49 crc kubenswrapper[4677]: I1007 13:11:49.585205 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5cjf7" event={"ID":"1ecf8a0f-0b5b-42c2-80c5-cb0a82421387","Type":"ContainerDied","Data":"87dcb21eaf9c5439f9d4a6390584bfcb0471d21b0b2cf7184e4b6d78b5b23c56"} Oct 07 13:11:49 crc kubenswrapper[4677]: I1007 13:11:49.585230 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5cjf7" event={"ID":"1ecf8a0f-0b5b-42c2-80c5-cb0a82421387","Type":"ContainerDied","Data":"59cff51fb4fbc9e7a8234bf09b605a6e134542e1dd127ec71ea988ad652d1784"} Oct 07 13:11:49 crc kubenswrapper[4677]: I1007 13:11:49.585318 4677 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5cjf7" Oct 07 13:11:49 crc kubenswrapper[4677]: I1007 13:11:49.588468 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-zbppr" event={"ID":"c8872e7f-608d-4ade-8466-e1e743417ece","Type":"ContainerStarted","Data":"3388b2bec7c0b44c7420da80d63e913a530d970e512d8353b78cf0a8592be6dd"} Oct 07 13:11:49 crc kubenswrapper[4677]: I1007 13:11:49.588512 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-zbppr" event={"ID":"c8872e7f-608d-4ade-8466-e1e743417ece","Type":"ContainerStarted","Data":"6ba6ae0c4b34c638dc206b563c36ce32805b76744dc793cec709b3c4bd948dfa"} Oct 07 13:11:49 crc kubenswrapper[4677]: I1007 13:11:49.589062 4677 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-zbppr" Oct 07 13:11:49 crc kubenswrapper[4677]: I1007 13:11:49.593204 4677 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-zbppr" Oct 07 13:11:49 crc kubenswrapper[4677]: I1007 13:11:49.596261 4677 generic.go:334] "Generic (PLEG): container finished" podID="c9473eea-460d-4148-8b4f-f2e0ccba3b2e" containerID="443eb657bd36b30993b21e49cb7afe7da1f3749016a99007eff3883295cfc1c8" exitCode=0 Oct 07 13:11:49 crc kubenswrapper[4677]: I1007 13:11:49.596297 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hlxmv" event={"ID":"c9473eea-460d-4148-8b4f-f2e0ccba3b2e","Type":"ContainerDied","Data":"443eb657bd36b30993b21e49cb7afe7da1f3749016a99007eff3883295cfc1c8"} Oct 07 13:11:49 crc kubenswrapper[4677]: I1007 13:11:49.596351 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hlxmv" event={"ID":"c9473eea-460d-4148-8b4f-f2e0ccba3b2e","Type":"ContainerDied","Data":"96867bd0aea0617d04ab569896a67cae7c05aedb53327c99adfc7a7881b82872"} Oct 07 13:11:49 crc kubenswrapper[4677]: I1007 13:11:49.596742 4677 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hlxmv" Oct 07 13:11:49 crc kubenswrapper[4677]: I1007 13:11:49.608198 4677 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-zbppr" podStartSLOduration=1.6081816899999999 podStartE2EDuration="1.60818169s" podCreationTimestamp="2025-10-07 13:11:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:11:49.604406751 +0000 UTC m=+281.090115866" watchObservedRunningTime="2025-10-07 13:11:49.60818169 +0000 UTC m=+281.093890815" Oct 07 13:11:49 crc kubenswrapper[4677]: I1007 13:11:49.620915 4677 scope.go:117] "RemoveContainer" containerID="d4f926410d977918318838fdb2e35eaa227cf01c40979973354c564ae71855d9" Oct 07 13:11:49 crc kubenswrapper[4677]: E1007 13:11:49.621464 4677 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d4f926410d977918318838fdb2e35eaa227cf01c40979973354c564ae71855d9\": container with ID starting with d4f926410d977918318838fdb2e35eaa227cf01c40979973354c564ae71855d9 not found: ID does not exist" containerID="d4f926410d977918318838fdb2e35eaa227cf01c40979973354c564ae71855d9" Oct 07 13:11:49 crc kubenswrapper[4677]: I1007 13:11:49.621519 4677 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d4f926410d977918318838fdb2e35eaa227cf01c40979973354c564ae71855d9"} err="failed to get container status \"d4f926410d977918318838fdb2e35eaa227cf01c40979973354c564ae71855d9\": rpc error: code = NotFound desc = could not find container \"d4f926410d977918318838fdb2e35eaa227cf01c40979973354c564ae71855d9\": container with ID starting with d4f926410d977918318838fdb2e35eaa227cf01c40979973354c564ae71855d9 not found: ID does not exist" Oct 07 13:11:49 crc kubenswrapper[4677]: I1007 13:11:49.621553 4677 scope.go:117] "RemoveContainer" containerID="c6063234bdae4039c7132918b951101182c614a1997ec7942328e7498b689ef9" Oct 07 13:11:49 crc kubenswrapper[4677]: I1007 13:11:49.640489 4677 scope.go:117] "RemoveContainer" containerID="d74e7174596e235fcd73eb7fa4385f5e2e9b2c502ba5e08d68ef20dea5132083" Oct 07 13:11:49 crc kubenswrapper[4677]: I1007 13:11:49.660606 4677 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hlxmv"] Oct 07 13:11:49 crc kubenswrapper[4677]: I1007 13:11:49.695368 4677 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-hlxmv"] Oct 07 13:11:49 crc kubenswrapper[4677]: I1007 13:11:49.709737 4677 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-27x4s"] Oct 07 13:11:49 crc kubenswrapper[4677]: I1007 13:11:49.712543 4677 scope.go:117] "RemoveContainer" containerID="75fd2eed622829fa48ebf03f86dab5fa4d72558b6979bab16fa44a6df09a89da" Oct 07 13:11:49 crc kubenswrapper[4677]: I1007 13:11:49.713187 4677 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-27x4s"] Oct 07 13:11:49 crc kubenswrapper[4677]: I1007 13:11:49.714907 4677 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-r6nq5"] Oct 07 13:11:49 crc kubenswrapper[4677]: I1007 13:11:49.719822 4677 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-r6nq5"] Oct 07 13:11:49 crc 
kubenswrapper[4677]: I1007 13:11:49.727403 4677 scope.go:117] "RemoveContainer" containerID="c6063234bdae4039c7132918b951101182c614a1997ec7942328e7498b689ef9" Oct 07 13:11:49 crc kubenswrapper[4677]: E1007 13:11:49.727797 4677 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c6063234bdae4039c7132918b951101182c614a1997ec7942328e7498b689ef9\": container with ID starting with c6063234bdae4039c7132918b951101182c614a1997ec7942328e7498b689ef9 not found: ID does not exist" containerID="c6063234bdae4039c7132918b951101182c614a1997ec7942328e7498b689ef9" Oct 07 13:11:49 crc kubenswrapper[4677]: I1007 13:11:49.727834 4677 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c6063234bdae4039c7132918b951101182c614a1997ec7942328e7498b689ef9"} err="failed to get container status \"c6063234bdae4039c7132918b951101182c614a1997ec7942328e7498b689ef9\": rpc error: code = NotFound desc = could not find container \"c6063234bdae4039c7132918b951101182c614a1997ec7942328e7498b689ef9\": container with ID starting with c6063234bdae4039c7132918b951101182c614a1997ec7942328e7498b689ef9 not found: ID does not exist" Oct 07 13:11:49 crc kubenswrapper[4677]: I1007 13:11:49.727857 4677 scope.go:117] "RemoveContainer" containerID="d74e7174596e235fcd73eb7fa4385f5e2e9b2c502ba5e08d68ef20dea5132083" Oct 07 13:11:49 crc kubenswrapper[4677]: E1007 13:11:49.728113 4677 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d74e7174596e235fcd73eb7fa4385f5e2e9b2c502ba5e08d68ef20dea5132083\": container with ID starting with d74e7174596e235fcd73eb7fa4385f5e2e9b2c502ba5e08d68ef20dea5132083 not found: ID does not exist" containerID="d74e7174596e235fcd73eb7fa4385f5e2e9b2c502ba5e08d68ef20dea5132083" Oct 07 13:11:49 crc kubenswrapper[4677]: I1007 13:11:49.728143 4677 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d74e7174596e235fcd73eb7fa4385f5e2e9b2c502ba5e08d68ef20dea5132083"} err="failed to get container status \"d74e7174596e235fcd73eb7fa4385f5e2e9b2c502ba5e08d68ef20dea5132083\": rpc error: code = NotFound desc = could not find container \"d74e7174596e235fcd73eb7fa4385f5e2e9b2c502ba5e08d68ef20dea5132083\": container with ID starting with d74e7174596e235fcd73eb7fa4385f5e2e9b2c502ba5e08d68ef20dea5132083 not found: ID does not exist" Oct 07 13:11:49 crc kubenswrapper[4677]: I1007 13:11:49.728166 4677 scope.go:117] "RemoveContainer" containerID="75fd2eed622829fa48ebf03f86dab5fa4d72558b6979bab16fa44a6df09a89da" Oct 07 13:11:49 crc kubenswrapper[4677]: E1007 13:11:49.728577 4677 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"75fd2eed622829fa48ebf03f86dab5fa4d72558b6979bab16fa44a6df09a89da\": container with ID starting with 75fd2eed622829fa48ebf03f86dab5fa4d72558b6979bab16fa44a6df09a89da not found: ID does not exist" containerID="75fd2eed622829fa48ebf03f86dab5fa4d72558b6979bab16fa44a6df09a89da" Oct 07 13:11:49 crc kubenswrapper[4677]: I1007 13:11:49.728597 4677 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75fd2eed622829fa48ebf03f86dab5fa4d72558b6979bab16fa44a6df09a89da"} err="failed to get container status \"75fd2eed622829fa48ebf03f86dab5fa4d72558b6979bab16fa44a6df09a89da\": rpc error: code = NotFound desc = could not find container 
\"75fd2eed622829fa48ebf03f86dab5fa4d72558b6979bab16fa44a6df09a89da\": container with ID starting with 75fd2eed622829fa48ebf03f86dab5fa4d72558b6979bab16fa44a6df09a89da not found: ID does not exist" Oct 07 13:11:49 crc kubenswrapper[4677]: I1007 13:11:49.728611 4677 scope.go:117] "RemoveContainer" containerID="07b90718fa1794fc4a0ca8352d664a3f198762e762f6dc658ab972880a1c62cb" Oct 07 13:11:49 crc kubenswrapper[4677]: I1007 13:11:49.730727 4677 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5cjf7"] Oct 07 13:11:49 crc kubenswrapper[4677]: I1007 13:11:49.732894 4677 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-5cjf7"] Oct 07 13:11:49 crc kubenswrapper[4677]: I1007 13:11:49.742529 4677 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-frwdt"] Oct 07 13:11:49 crc kubenswrapper[4677]: I1007 13:11:49.747160 4677 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-frwdt"] Oct 07 13:11:49 crc kubenswrapper[4677]: I1007 13:11:49.752367 4677 scope.go:117] "RemoveContainer" containerID="37a6f1bef96a33c09a37a5861bd31c7b32f9d9c7832b94773f92af63867fdbb4" Oct 07 13:11:49 crc kubenswrapper[4677]: I1007 13:11:49.765410 4677 scope.go:117] "RemoveContainer" containerID="9d23d80be02384015c4165dd8531b5a703386c8e6a048345d1c600ede802ec23" Oct 07 13:11:49 crc kubenswrapper[4677]: I1007 13:11:49.781928 4677 scope.go:117] "RemoveContainer" containerID="87dcb21eaf9c5439f9d4a6390584bfcb0471d21b0b2cf7184e4b6d78b5b23c56" Oct 07 13:11:49 crc kubenswrapper[4677]: I1007 13:11:49.798187 4677 scope.go:117] "RemoveContainer" containerID="cadbd08b91cbe1940e3a928624cf6cbea690e9545aaca2b5f54fc057ca8835cc" Oct 07 13:11:49 crc kubenswrapper[4677]: I1007 13:11:49.809626 4677 scope.go:117] "RemoveContainer" containerID="49bb7520b791d734aa657d3da31e6fc7b72e2df992b8a171908a3152eecb1be2" Oct 07 13:11:49 crc kubenswrapper[4677]: I1007 13:11:49.822269 4677 scope.go:117] "RemoveContainer" containerID="87dcb21eaf9c5439f9d4a6390584bfcb0471d21b0b2cf7184e4b6d78b5b23c56" Oct 07 13:11:49 crc kubenswrapper[4677]: E1007 13:11:49.822709 4677 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"87dcb21eaf9c5439f9d4a6390584bfcb0471d21b0b2cf7184e4b6d78b5b23c56\": container with ID starting with 87dcb21eaf9c5439f9d4a6390584bfcb0471d21b0b2cf7184e4b6d78b5b23c56 not found: ID does not exist" containerID="87dcb21eaf9c5439f9d4a6390584bfcb0471d21b0b2cf7184e4b6d78b5b23c56" Oct 07 13:11:49 crc kubenswrapper[4677]: I1007 13:11:49.822769 4677 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"87dcb21eaf9c5439f9d4a6390584bfcb0471d21b0b2cf7184e4b6d78b5b23c56"} err="failed to get container status \"87dcb21eaf9c5439f9d4a6390584bfcb0471d21b0b2cf7184e4b6d78b5b23c56\": rpc error: code = NotFound desc = could not find container \"87dcb21eaf9c5439f9d4a6390584bfcb0471d21b0b2cf7184e4b6d78b5b23c56\": container with ID starting with 87dcb21eaf9c5439f9d4a6390584bfcb0471d21b0b2cf7184e4b6d78b5b23c56 not found: ID does not exist" Oct 07 13:11:49 crc kubenswrapper[4677]: I1007 13:11:49.822813 4677 scope.go:117] "RemoveContainer" containerID="cadbd08b91cbe1940e3a928624cf6cbea690e9545aaca2b5f54fc057ca8835cc" Oct 07 13:11:49 crc kubenswrapper[4677]: E1007 13:11:49.823080 4677 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find 
container \"cadbd08b91cbe1940e3a928624cf6cbea690e9545aaca2b5f54fc057ca8835cc\": container with ID starting with cadbd08b91cbe1940e3a928624cf6cbea690e9545aaca2b5f54fc057ca8835cc not found: ID does not exist" containerID="cadbd08b91cbe1940e3a928624cf6cbea690e9545aaca2b5f54fc057ca8835cc" Oct 07 13:11:49 crc kubenswrapper[4677]: I1007 13:11:49.823107 4677 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cadbd08b91cbe1940e3a928624cf6cbea690e9545aaca2b5f54fc057ca8835cc"} err="failed to get container status \"cadbd08b91cbe1940e3a928624cf6cbea690e9545aaca2b5f54fc057ca8835cc\": rpc error: code = NotFound desc = could not find container \"cadbd08b91cbe1940e3a928624cf6cbea690e9545aaca2b5f54fc057ca8835cc\": container with ID starting with cadbd08b91cbe1940e3a928624cf6cbea690e9545aaca2b5f54fc057ca8835cc not found: ID does not exist" Oct 07 13:11:49 crc kubenswrapper[4677]: I1007 13:11:49.823124 4677 scope.go:117] "RemoveContainer" containerID="49bb7520b791d734aa657d3da31e6fc7b72e2df992b8a171908a3152eecb1be2" Oct 07 13:11:49 crc kubenswrapper[4677]: E1007 13:11:49.823365 4677 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"49bb7520b791d734aa657d3da31e6fc7b72e2df992b8a171908a3152eecb1be2\": container with ID starting with 49bb7520b791d734aa657d3da31e6fc7b72e2df992b8a171908a3152eecb1be2 not found: ID does not exist" containerID="49bb7520b791d734aa657d3da31e6fc7b72e2df992b8a171908a3152eecb1be2" Oct 07 13:11:49 crc kubenswrapper[4677]: I1007 13:11:49.823397 4677 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"49bb7520b791d734aa657d3da31e6fc7b72e2df992b8a171908a3152eecb1be2"} err="failed to get container status \"49bb7520b791d734aa657d3da31e6fc7b72e2df992b8a171908a3152eecb1be2\": rpc error: code = NotFound desc = could not find container \"49bb7520b791d734aa657d3da31e6fc7b72e2df992b8a171908a3152eecb1be2\": container with ID starting with 49bb7520b791d734aa657d3da31e6fc7b72e2df992b8a171908a3152eecb1be2 not found: ID does not exist" Oct 07 13:11:49 crc kubenswrapper[4677]: I1007 13:11:49.823422 4677 scope.go:117] "RemoveContainer" containerID="443eb657bd36b30993b21e49cb7afe7da1f3749016a99007eff3883295cfc1c8" Oct 07 13:11:49 crc kubenswrapper[4677]: I1007 13:11:49.837088 4677 scope.go:117] "RemoveContainer" containerID="760aab4344ca60eff10acea71401c154ecbb32e6eb82e97c465877cca8649bf7" Oct 07 13:11:49 crc kubenswrapper[4677]: I1007 13:11:49.855332 4677 scope.go:117] "RemoveContainer" containerID="293d0e71613e7cd3086a9b53010356ef3b1dd4d59c4bc031f7ee183df98874ac" Oct 07 13:11:49 crc kubenswrapper[4677]: I1007 13:11:49.874095 4677 scope.go:117] "RemoveContainer" containerID="443eb657bd36b30993b21e49cb7afe7da1f3749016a99007eff3883295cfc1c8" Oct 07 13:11:49 crc kubenswrapper[4677]: E1007 13:11:49.875590 4677 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"443eb657bd36b30993b21e49cb7afe7da1f3749016a99007eff3883295cfc1c8\": container with ID starting with 443eb657bd36b30993b21e49cb7afe7da1f3749016a99007eff3883295cfc1c8 not found: ID does not exist" containerID="443eb657bd36b30993b21e49cb7afe7da1f3749016a99007eff3883295cfc1c8" Oct 07 13:11:49 crc kubenswrapper[4677]: I1007 13:11:49.875634 4677 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"443eb657bd36b30993b21e49cb7afe7da1f3749016a99007eff3883295cfc1c8"} err="failed to get 
container status \"443eb657bd36b30993b21e49cb7afe7da1f3749016a99007eff3883295cfc1c8\": rpc error: code = NotFound desc = could not find container \"443eb657bd36b30993b21e49cb7afe7da1f3749016a99007eff3883295cfc1c8\": container with ID starting with 443eb657bd36b30993b21e49cb7afe7da1f3749016a99007eff3883295cfc1c8 not found: ID does not exist" Oct 07 13:11:49 crc kubenswrapper[4677]: I1007 13:11:49.875665 4677 scope.go:117] "RemoveContainer" containerID="760aab4344ca60eff10acea71401c154ecbb32e6eb82e97c465877cca8649bf7" Oct 07 13:11:49 crc kubenswrapper[4677]: E1007 13:11:49.876192 4677 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"760aab4344ca60eff10acea71401c154ecbb32e6eb82e97c465877cca8649bf7\": container with ID starting with 760aab4344ca60eff10acea71401c154ecbb32e6eb82e97c465877cca8649bf7 not found: ID does not exist" containerID="760aab4344ca60eff10acea71401c154ecbb32e6eb82e97c465877cca8649bf7" Oct 07 13:11:49 crc kubenswrapper[4677]: I1007 13:11:49.876245 4677 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"760aab4344ca60eff10acea71401c154ecbb32e6eb82e97c465877cca8649bf7"} err="failed to get container status \"760aab4344ca60eff10acea71401c154ecbb32e6eb82e97c465877cca8649bf7\": rpc error: code = NotFound desc = could not find container \"760aab4344ca60eff10acea71401c154ecbb32e6eb82e97c465877cca8649bf7\": container with ID starting with 760aab4344ca60eff10acea71401c154ecbb32e6eb82e97c465877cca8649bf7 not found: ID does not exist" Oct 07 13:11:49 crc kubenswrapper[4677]: I1007 13:11:49.876280 4677 scope.go:117] "RemoveContainer" containerID="293d0e71613e7cd3086a9b53010356ef3b1dd4d59c4bc031f7ee183df98874ac" Oct 07 13:11:49 crc kubenswrapper[4677]: E1007 13:11:49.876583 4677 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"293d0e71613e7cd3086a9b53010356ef3b1dd4d59c4bc031f7ee183df98874ac\": container with ID starting with 293d0e71613e7cd3086a9b53010356ef3b1dd4d59c4bc031f7ee183df98874ac not found: ID does not exist" containerID="293d0e71613e7cd3086a9b53010356ef3b1dd4d59c4bc031f7ee183df98874ac" Oct 07 13:11:49 crc kubenswrapper[4677]: I1007 13:11:49.876620 4677 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"293d0e71613e7cd3086a9b53010356ef3b1dd4d59c4bc031f7ee183df98874ac"} err="failed to get container status \"293d0e71613e7cd3086a9b53010356ef3b1dd4d59c4bc031f7ee183df98874ac\": rpc error: code = NotFound desc = could not find container \"293d0e71613e7cd3086a9b53010356ef3b1dd4d59c4bc031f7ee183df98874ac\": container with ID starting with 293d0e71613e7cd3086a9b53010356ef3b1dd4d59c4bc031f7ee183df98874ac not found: ID does not exist" Oct 07 13:11:50 crc kubenswrapper[4677]: I1007 13:11:50.644205 4677 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-r7528"] Oct 07 13:11:50 crc kubenswrapper[4677]: E1007 13:11:50.644672 4677 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8932abb2-d07f-45df-bd32-0ac930df1346" containerName="extract-utilities" Oct 07 13:11:50 crc kubenswrapper[4677]: I1007 13:11:50.644686 4677 state_mem.go:107] "Deleted CPUSet assignment" podUID="8932abb2-d07f-45df-bd32-0ac930df1346" containerName="extract-utilities" Oct 07 13:11:50 crc kubenswrapper[4677]: E1007 13:11:50.644698 4677 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="1ecf8a0f-0b5b-42c2-80c5-cb0a82421387" containerName="registry-server" Oct 07 13:11:50 crc kubenswrapper[4677]: I1007 13:11:50.644709 4677 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ecf8a0f-0b5b-42c2-80c5-cb0a82421387" containerName="registry-server" Oct 07 13:11:50 crc kubenswrapper[4677]: E1007 13:11:50.644725 4677 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57d9fc75-7df6-4205-9600-0e0d0ff04f8a" containerName="extract-content" Oct 07 13:11:50 crc kubenswrapper[4677]: I1007 13:11:50.644732 4677 state_mem.go:107] "Deleted CPUSet assignment" podUID="57d9fc75-7df6-4205-9600-0e0d0ff04f8a" containerName="extract-content" Oct 07 13:11:50 crc kubenswrapper[4677]: E1007 13:11:50.645076 4677 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57d9fc75-7df6-4205-9600-0e0d0ff04f8a" containerName="registry-server" Oct 07 13:11:50 crc kubenswrapper[4677]: I1007 13:11:50.645088 4677 state_mem.go:107] "Deleted CPUSet assignment" podUID="57d9fc75-7df6-4205-9600-0e0d0ff04f8a" containerName="registry-server" Oct 07 13:11:50 crc kubenswrapper[4677]: E1007 13:11:50.645099 4677 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ecf8a0f-0b5b-42c2-80c5-cb0a82421387" containerName="extract-content" Oct 07 13:11:50 crc kubenswrapper[4677]: I1007 13:11:50.645107 4677 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ecf8a0f-0b5b-42c2-80c5-cb0a82421387" containerName="extract-content" Oct 07 13:11:50 crc kubenswrapper[4677]: E1007 13:11:50.645118 4677 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57d9fc75-7df6-4205-9600-0e0d0ff04f8a" containerName="extract-utilities" Oct 07 13:11:50 crc kubenswrapper[4677]: I1007 13:11:50.645126 4677 state_mem.go:107] "Deleted CPUSet assignment" podUID="57d9fc75-7df6-4205-9600-0e0d0ff04f8a" containerName="extract-utilities" Oct 07 13:11:50 crc kubenswrapper[4677]: E1007 13:11:50.645138 4677 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9122c8d7-acc8-4ed0-81b0-79ea36536943" containerName="marketplace-operator" Oct 07 13:11:50 crc kubenswrapper[4677]: I1007 13:11:50.645146 4677 state_mem.go:107] "Deleted CPUSet assignment" podUID="9122c8d7-acc8-4ed0-81b0-79ea36536943" containerName="marketplace-operator" Oct 07 13:11:50 crc kubenswrapper[4677]: E1007 13:11:50.645157 4677 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8932abb2-d07f-45df-bd32-0ac930df1346" containerName="extract-content" Oct 07 13:11:50 crc kubenswrapper[4677]: I1007 13:11:50.645166 4677 state_mem.go:107] "Deleted CPUSet assignment" podUID="8932abb2-d07f-45df-bd32-0ac930df1346" containerName="extract-content" Oct 07 13:11:50 crc kubenswrapper[4677]: E1007 13:11:50.645175 4677 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9473eea-460d-4148-8b4f-f2e0ccba3b2e" containerName="registry-server" Oct 07 13:11:50 crc kubenswrapper[4677]: I1007 13:11:50.645183 4677 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9473eea-460d-4148-8b4f-f2e0ccba3b2e" containerName="registry-server" Oct 07 13:11:50 crc kubenswrapper[4677]: E1007 13:11:50.645194 4677 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ecf8a0f-0b5b-42c2-80c5-cb0a82421387" containerName="extract-utilities" Oct 07 13:11:50 crc kubenswrapper[4677]: I1007 13:11:50.645203 4677 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ecf8a0f-0b5b-42c2-80c5-cb0a82421387" containerName="extract-utilities" Oct 07 13:11:50 crc kubenswrapper[4677]: E1007 13:11:50.645214 4677 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="c9473eea-460d-4148-8b4f-f2e0ccba3b2e" containerName="extract-utilities" Oct 07 13:11:50 crc kubenswrapper[4677]: I1007 13:11:50.645223 4677 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9473eea-460d-4148-8b4f-f2e0ccba3b2e" containerName="extract-utilities" Oct 07 13:11:50 crc kubenswrapper[4677]: E1007 13:11:50.645231 4677 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8932abb2-d07f-45df-bd32-0ac930df1346" containerName="registry-server" Oct 07 13:11:50 crc kubenswrapper[4677]: I1007 13:11:50.645239 4677 state_mem.go:107] "Deleted CPUSet assignment" podUID="8932abb2-d07f-45df-bd32-0ac930df1346" containerName="registry-server" Oct 07 13:11:50 crc kubenswrapper[4677]: E1007 13:11:50.645249 4677 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9473eea-460d-4148-8b4f-f2e0ccba3b2e" containerName="extract-content" Oct 07 13:11:50 crc kubenswrapper[4677]: I1007 13:11:50.645259 4677 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9473eea-460d-4148-8b4f-f2e0ccba3b2e" containerName="extract-content" Oct 07 13:11:50 crc kubenswrapper[4677]: I1007 13:11:50.645369 4677 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9473eea-460d-4148-8b4f-f2e0ccba3b2e" containerName="registry-server" Oct 07 13:11:50 crc kubenswrapper[4677]: I1007 13:11:50.645383 4677 memory_manager.go:354] "RemoveStaleState removing state" podUID="57d9fc75-7df6-4205-9600-0e0d0ff04f8a" containerName="registry-server" Oct 07 13:11:50 crc kubenswrapper[4677]: I1007 13:11:50.645398 4677 memory_manager.go:354] "RemoveStaleState removing state" podUID="8932abb2-d07f-45df-bd32-0ac930df1346" containerName="registry-server" Oct 07 13:11:50 crc kubenswrapper[4677]: I1007 13:11:50.645408 4677 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ecf8a0f-0b5b-42c2-80c5-cb0a82421387" containerName="registry-server" Oct 07 13:11:50 crc kubenswrapper[4677]: I1007 13:11:50.645420 4677 memory_manager.go:354] "RemoveStaleState removing state" podUID="9122c8d7-acc8-4ed0-81b0-79ea36536943" containerName="marketplace-operator" Oct 07 13:11:50 crc kubenswrapper[4677]: I1007 13:11:50.646240 4677 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-r7528" Oct 07 13:11:50 crc kubenswrapper[4677]: I1007 13:11:50.648534 4677 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Oct 07 13:11:50 crc kubenswrapper[4677]: I1007 13:11:50.660245 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-r7528"] Oct 07 13:11:50 crc kubenswrapper[4677]: I1007 13:11:50.761313 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/422fbc9b-de01-4f7f-8be7-0036569e6dbb-utilities\") pod \"redhat-marketplace-r7528\" (UID: \"422fbc9b-de01-4f7f-8be7-0036569e6dbb\") " pod="openshift-marketplace/redhat-marketplace-r7528" Oct 07 13:11:50 crc kubenswrapper[4677]: I1007 13:11:50.761397 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/422fbc9b-de01-4f7f-8be7-0036569e6dbb-catalog-content\") pod \"redhat-marketplace-r7528\" (UID: \"422fbc9b-de01-4f7f-8be7-0036569e6dbb\") " pod="openshift-marketplace/redhat-marketplace-r7528" Oct 07 13:11:50 crc kubenswrapper[4677]: I1007 13:11:50.761587 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dj5hc\" (UniqueName: \"kubernetes.io/projected/422fbc9b-de01-4f7f-8be7-0036569e6dbb-kube-api-access-dj5hc\") pod \"redhat-marketplace-r7528\" (UID: \"422fbc9b-de01-4f7f-8be7-0036569e6dbb\") " pod="openshift-marketplace/redhat-marketplace-r7528" Oct 07 13:11:50 crc kubenswrapper[4677]: I1007 13:11:50.859716 4677 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-pm2jz"] Oct 07 13:11:50 crc kubenswrapper[4677]: I1007 13:11:50.862177 4677 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-pm2jz" Oct 07 13:11:50 crc kubenswrapper[4677]: I1007 13:11:50.863300 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/422fbc9b-de01-4f7f-8be7-0036569e6dbb-utilities\") pod \"redhat-marketplace-r7528\" (UID: \"422fbc9b-de01-4f7f-8be7-0036569e6dbb\") " pod="openshift-marketplace/redhat-marketplace-r7528" Oct 07 13:11:50 crc kubenswrapper[4677]: I1007 13:11:50.863343 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/422fbc9b-de01-4f7f-8be7-0036569e6dbb-catalog-content\") pod \"redhat-marketplace-r7528\" (UID: \"422fbc9b-de01-4f7f-8be7-0036569e6dbb\") " pod="openshift-marketplace/redhat-marketplace-r7528" Oct 07 13:11:50 crc kubenswrapper[4677]: I1007 13:11:50.863445 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dj5hc\" (UniqueName: \"kubernetes.io/projected/422fbc9b-de01-4f7f-8be7-0036569e6dbb-kube-api-access-dj5hc\") pod \"redhat-marketplace-r7528\" (UID: \"422fbc9b-de01-4f7f-8be7-0036569e6dbb\") " pod="openshift-marketplace/redhat-marketplace-r7528" Oct 07 13:11:50 crc kubenswrapper[4677]: I1007 13:11:50.864644 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/422fbc9b-de01-4f7f-8be7-0036569e6dbb-catalog-content\") pod \"redhat-marketplace-r7528\" (UID: \"422fbc9b-de01-4f7f-8be7-0036569e6dbb\") " pod="openshift-marketplace/redhat-marketplace-r7528" Oct 07 13:11:50 crc kubenswrapper[4677]: I1007 13:11:50.865001 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-pm2jz"] Oct 07 13:11:50 crc kubenswrapper[4677]: I1007 13:11:50.865264 4677 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Oct 07 13:11:50 crc kubenswrapper[4677]: I1007 13:11:50.868985 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/422fbc9b-de01-4f7f-8be7-0036569e6dbb-utilities\") pod \"redhat-marketplace-r7528\" (UID: \"422fbc9b-de01-4f7f-8be7-0036569e6dbb\") " pod="openshift-marketplace/redhat-marketplace-r7528" Oct 07 13:11:50 crc kubenswrapper[4677]: I1007 13:11:50.880307 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dj5hc\" (UniqueName: \"kubernetes.io/projected/422fbc9b-de01-4f7f-8be7-0036569e6dbb-kube-api-access-dj5hc\") pod \"redhat-marketplace-r7528\" (UID: \"422fbc9b-de01-4f7f-8be7-0036569e6dbb\") " pod="openshift-marketplace/redhat-marketplace-r7528" Oct 07 13:11:50 crc kubenswrapper[4677]: I1007 13:11:50.964219 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e628b85d-09f9-4559-9853-b35c19a0e0e6-catalog-content\") pod \"redhat-operators-pm2jz\" (UID: \"e628b85d-09f9-4559-9853-b35c19a0e0e6\") " pod="openshift-marketplace/redhat-operators-pm2jz" Oct 07 13:11:50 crc kubenswrapper[4677]: I1007 13:11:50.964403 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e628b85d-09f9-4559-9853-b35c19a0e0e6-utilities\") pod \"redhat-operators-pm2jz\" (UID: \"e628b85d-09f9-4559-9853-b35c19a0e0e6\") " 
pod="openshift-marketplace/redhat-operators-pm2jz" Oct 07 13:11:50 crc kubenswrapper[4677]: I1007 13:11:50.964611 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4tpdl\" (UniqueName: \"kubernetes.io/projected/e628b85d-09f9-4559-9853-b35c19a0e0e6-kube-api-access-4tpdl\") pod \"redhat-operators-pm2jz\" (UID: \"e628b85d-09f9-4559-9853-b35c19a0e0e6\") " pod="openshift-marketplace/redhat-operators-pm2jz" Oct 07 13:11:50 crc kubenswrapper[4677]: I1007 13:11:50.981382 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-r7528" Oct 07 13:11:51 crc kubenswrapper[4677]: I1007 13:11:51.066220 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e628b85d-09f9-4559-9853-b35c19a0e0e6-catalog-content\") pod \"redhat-operators-pm2jz\" (UID: \"e628b85d-09f9-4559-9853-b35c19a0e0e6\") " pod="openshift-marketplace/redhat-operators-pm2jz" Oct 07 13:11:51 crc kubenswrapper[4677]: I1007 13:11:51.066625 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e628b85d-09f9-4559-9853-b35c19a0e0e6-utilities\") pod \"redhat-operators-pm2jz\" (UID: \"e628b85d-09f9-4559-9853-b35c19a0e0e6\") " pod="openshift-marketplace/redhat-operators-pm2jz" Oct 07 13:11:51 crc kubenswrapper[4677]: I1007 13:11:51.066667 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4tpdl\" (UniqueName: \"kubernetes.io/projected/e628b85d-09f9-4559-9853-b35c19a0e0e6-kube-api-access-4tpdl\") pod \"redhat-operators-pm2jz\" (UID: \"e628b85d-09f9-4559-9853-b35c19a0e0e6\") " pod="openshift-marketplace/redhat-operators-pm2jz" Oct 07 13:11:51 crc kubenswrapper[4677]: I1007 13:11:51.067179 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e628b85d-09f9-4559-9853-b35c19a0e0e6-catalog-content\") pod \"redhat-operators-pm2jz\" (UID: \"e628b85d-09f9-4559-9853-b35c19a0e0e6\") " pod="openshift-marketplace/redhat-operators-pm2jz" Oct 07 13:11:51 crc kubenswrapper[4677]: I1007 13:11:51.067311 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e628b85d-09f9-4559-9853-b35c19a0e0e6-utilities\") pod \"redhat-operators-pm2jz\" (UID: \"e628b85d-09f9-4559-9853-b35c19a0e0e6\") " pod="openshift-marketplace/redhat-operators-pm2jz" Oct 07 13:11:51 crc kubenswrapper[4677]: I1007 13:11:51.088855 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4tpdl\" (UniqueName: \"kubernetes.io/projected/e628b85d-09f9-4559-9853-b35c19a0e0e6-kube-api-access-4tpdl\") pod \"redhat-operators-pm2jz\" (UID: \"e628b85d-09f9-4559-9853-b35c19a0e0e6\") " pod="openshift-marketplace/redhat-operators-pm2jz" Oct 07 13:11:51 crc kubenswrapper[4677]: I1007 13:11:51.195209 4677 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-pm2jz" Oct 07 13:11:51 crc kubenswrapper[4677]: I1007 13:11:51.313658 4677 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1ecf8a0f-0b5b-42c2-80c5-cb0a82421387" path="/var/lib/kubelet/pods/1ecf8a0f-0b5b-42c2-80c5-cb0a82421387/volumes" Oct 07 13:11:51 crc kubenswrapper[4677]: I1007 13:11:51.314619 4677 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57d9fc75-7df6-4205-9600-0e0d0ff04f8a" path="/var/lib/kubelet/pods/57d9fc75-7df6-4205-9600-0e0d0ff04f8a/volumes" Oct 07 13:11:51 crc kubenswrapper[4677]: I1007 13:11:51.315458 4677 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8932abb2-d07f-45df-bd32-0ac930df1346" path="/var/lib/kubelet/pods/8932abb2-d07f-45df-bd32-0ac930df1346/volumes" Oct 07 13:11:51 crc kubenswrapper[4677]: I1007 13:11:51.316741 4677 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9122c8d7-acc8-4ed0-81b0-79ea36536943" path="/var/lib/kubelet/pods/9122c8d7-acc8-4ed0-81b0-79ea36536943/volumes" Oct 07 13:11:51 crc kubenswrapper[4677]: I1007 13:11:51.317278 4677 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c9473eea-460d-4148-8b4f-f2e0ccba3b2e" path="/var/lib/kubelet/pods/c9473eea-460d-4148-8b4f-f2e0ccba3b2e/volumes" Oct 07 13:11:51 crc kubenswrapper[4677]: I1007 13:11:51.357869 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-r7528"] Oct 07 13:11:51 crc kubenswrapper[4677]: I1007 13:11:51.602856 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-pm2jz"] Oct 07 13:11:51 crc kubenswrapper[4677]: W1007 13:11:51.604493 4677 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode628b85d_09f9_4559_9853_b35c19a0e0e6.slice/crio-04b41a66d4bd1b7e7fa5fbae37b961879651e9a6106388363cfe1792d459e1d2 WatchSource:0}: Error finding container 04b41a66d4bd1b7e7fa5fbae37b961879651e9a6106388363cfe1792d459e1d2: Status 404 returned error can't find the container with id 04b41a66d4bd1b7e7fa5fbae37b961879651e9a6106388363cfe1792d459e1d2 Oct 07 13:11:51 crc kubenswrapper[4677]: I1007 13:11:51.614571 4677 generic.go:334] "Generic (PLEG): container finished" podID="422fbc9b-de01-4f7f-8be7-0036569e6dbb" containerID="19190043072d79045899f6c03eee66e960120b87e4d83c2f371e507eef32ec20" exitCode=0 Oct 07 13:11:51 crc kubenswrapper[4677]: I1007 13:11:51.614679 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-r7528" event={"ID":"422fbc9b-de01-4f7f-8be7-0036569e6dbb","Type":"ContainerDied","Data":"19190043072d79045899f6c03eee66e960120b87e4d83c2f371e507eef32ec20"} Oct 07 13:11:51 crc kubenswrapper[4677]: I1007 13:11:51.614751 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-r7528" event={"ID":"422fbc9b-de01-4f7f-8be7-0036569e6dbb","Type":"ContainerStarted","Data":"b2c7eaea9bb828b313f0501c16abe09e36fef35929b69bf256ef6d6891b12be6"} Oct 07 13:11:51 crc kubenswrapper[4677]: I1007 13:11:51.616346 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pm2jz" event={"ID":"e628b85d-09f9-4559-9853-b35c19a0e0e6","Type":"ContainerStarted","Data":"04b41a66d4bd1b7e7fa5fbae37b961879651e9a6106388363cfe1792d459e1d2"} Oct 07 13:11:52 crc kubenswrapper[4677]: I1007 13:11:52.627294 4677 generic.go:334] "Generic (PLEG): container finished" 
podID="e628b85d-09f9-4559-9853-b35c19a0e0e6" containerID="5be1148d1a8d3622e4c83cbddf156c93a0d8d1f1edd591a5713dbdc4585c620d" exitCode=0 Oct 07 13:11:52 crc kubenswrapper[4677]: I1007 13:11:52.627419 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pm2jz" event={"ID":"e628b85d-09f9-4559-9853-b35c19a0e0e6","Type":"ContainerDied","Data":"5be1148d1a8d3622e4c83cbddf156c93a0d8d1f1edd591a5713dbdc4585c620d"} Oct 07 13:11:53 crc kubenswrapper[4677]: I1007 13:11:53.046123 4677 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-pjgk2"] Oct 07 13:11:53 crc kubenswrapper[4677]: I1007 13:11:53.047977 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-pjgk2" Oct 07 13:11:53 crc kubenswrapper[4677]: I1007 13:11:53.049824 4677 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Oct 07 13:11:53 crc kubenswrapper[4677]: I1007 13:11:53.058694 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-pjgk2"] Oct 07 13:11:53 crc kubenswrapper[4677]: I1007 13:11:53.192344 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nxpb4\" (UniqueName: \"kubernetes.io/projected/a97dd5c5-2e20-4155-8829-acd24af4fe9f-kube-api-access-nxpb4\") pod \"certified-operators-pjgk2\" (UID: \"a97dd5c5-2e20-4155-8829-acd24af4fe9f\") " pod="openshift-marketplace/certified-operators-pjgk2" Oct 07 13:11:53 crc kubenswrapper[4677]: I1007 13:11:53.192451 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a97dd5c5-2e20-4155-8829-acd24af4fe9f-utilities\") pod \"certified-operators-pjgk2\" (UID: \"a97dd5c5-2e20-4155-8829-acd24af4fe9f\") " pod="openshift-marketplace/certified-operators-pjgk2" Oct 07 13:11:53 crc kubenswrapper[4677]: I1007 13:11:53.192487 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a97dd5c5-2e20-4155-8829-acd24af4fe9f-catalog-content\") pod \"certified-operators-pjgk2\" (UID: \"a97dd5c5-2e20-4155-8829-acd24af4fe9f\") " pod="openshift-marketplace/certified-operators-pjgk2" Oct 07 13:11:53 crc kubenswrapper[4677]: I1007 13:11:53.256666 4677 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-n428h"] Oct 07 13:11:53 crc kubenswrapper[4677]: I1007 13:11:53.259629 4677 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-n428h" Oct 07 13:11:53 crc kubenswrapper[4677]: I1007 13:11:53.270138 4677 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Oct 07 13:11:53 crc kubenswrapper[4677]: I1007 13:11:53.277128 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-n428h"] Oct 07 13:11:53 crc kubenswrapper[4677]: I1007 13:11:53.298236 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nxpb4\" (UniqueName: \"kubernetes.io/projected/a97dd5c5-2e20-4155-8829-acd24af4fe9f-kube-api-access-nxpb4\") pod \"certified-operators-pjgk2\" (UID: \"a97dd5c5-2e20-4155-8829-acd24af4fe9f\") " pod="openshift-marketplace/certified-operators-pjgk2" Oct 07 13:11:53 crc kubenswrapper[4677]: I1007 13:11:53.298925 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a97dd5c5-2e20-4155-8829-acd24af4fe9f-utilities\") pod \"certified-operators-pjgk2\" (UID: \"a97dd5c5-2e20-4155-8829-acd24af4fe9f\") " pod="openshift-marketplace/certified-operators-pjgk2" Oct 07 13:11:53 crc kubenswrapper[4677]: I1007 13:11:53.299205 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a97dd5c5-2e20-4155-8829-acd24af4fe9f-catalog-content\") pod \"certified-operators-pjgk2\" (UID: \"a97dd5c5-2e20-4155-8829-acd24af4fe9f\") " pod="openshift-marketplace/certified-operators-pjgk2" Oct 07 13:11:53 crc kubenswrapper[4677]: I1007 13:11:53.299733 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a97dd5c5-2e20-4155-8829-acd24af4fe9f-catalog-content\") pod \"certified-operators-pjgk2\" (UID: \"a97dd5c5-2e20-4155-8829-acd24af4fe9f\") " pod="openshift-marketplace/certified-operators-pjgk2" Oct 07 13:11:53 crc kubenswrapper[4677]: I1007 13:11:53.300133 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a97dd5c5-2e20-4155-8829-acd24af4fe9f-utilities\") pod \"certified-operators-pjgk2\" (UID: \"a97dd5c5-2e20-4155-8829-acd24af4fe9f\") " pod="openshift-marketplace/certified-operators-pjgk2" Oct 07 13:11:53 crc kubenswrapper[4677]: I1007 13:11:53.317506 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nxpb4\" (UniqueName: \"kubernetes.io/projected/a97dd5c5-2e20-4155-8829-acd24af4fe9f-kube-api-access-nxpb4\") pod \"certified-operators-pjgk2\" (UID: \"a97dd5c5-2e20-4155-8829-acd24af4fe9f\") " pod="openshift-marketplace/certified-operators-pjgk2" Oct 07 13:11:53 crc kubenswrapper[4677]: I1007 13:11:53.369465 4677 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-pjgk2" Oct 07 13:11:53 crc kubenswrapper[4677]: I1007 13:11:53.400177 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ad380640-a5ab-4fb9-838b-28b4732c597e-catalog-content\") pod \"community-operators-n428h\" (UID: \"ad380640-a5ab-4fb9-838b-28b4732c597e\") " pod="openshift-marketplace/community-operators-n428h" Oct 07 13:11:53 crc kubenswrapper[4677]: I1007 13:11:53.400224 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cmsvj\" (UniqueName: \"kubernetes.io/projected/ad380640-a5ab-4fb9-838b-28b4732c597e-kube-api-access-cmsvj\") pod \"community-operators-n428h\" (UID: \"ad380640-a5ab-4fb9-838b-28b4732c597e\") " pod="openshift-marketplace/community-operators-n428h" Oct 07 13:11:53 crc kubenswrapper[4677]: I1007 13:11:53.400265 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ad380640-a5ab-4fb9-838b-28b4732c597e-utilities\") pod \"community-operators-n428h\" (UID: \"ad380640-a5ab-4fb9-838b-28b4732c597e\") " pod="openshift-marketplace/community-operators-n428h" Oct 07 13:11:53 crc kubenswrapper[4677]: I1007 13:11:53.501724 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ad380640-a5ab-4fb9-838b-28b4732c597e-catalog-content\") pod \"community-operators-n428h\" (UID: \"ad380640-a5ab-4fb9-838b-28b4732c597e\") " pod="openshift-marketplace/community-operators-n428h" Oct 07 13:11:53 crc kubenswrapper[4677]: I1007 13:11:53.502122 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cmsvj\" (UniqueName: \"kubernetes.io/projected/ad380640-a5ab-4fb9-838b-28b4732c597e-kube-api-access-cmsvj\") pod \"community-operators-n428h\" (UID: \"ad380640-a5ab-4fb9-838b-28b4732c597e\") " pod="openshift-marketplace/community-operators-n428h" Oct 07 13:11:53 crc kubenswrapper[4677]: I1007 13:11:53.502203 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ad380640-a5ab-4fb9-838b-28b4732c597e-utilities\") pod \"community-operators-n428h\" (UID: \"ad380640-a5ab-4fb9-838b-28b4732c597e\") " pod="openshift-marketplace/community-operators-n428h" Oct 07 13:11:53 crc kubenswrapper[4677]: I1007 13:11:53.502634 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ad380640-a5ab-4fb9-838b-28b4732c597e-catalog-content\") pod \"community-operators-n428h\" (UID: \"ad380640-a5ab-4fb9-838b-28b4732c597e\") " pod="openshift-marketplace/community-operators-n428h" Oct 07 13:11:53 crc kubenswrapper[4677]: I1007 13:11:53.502846 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ad380640-a5ab-4fb9-838b-28b4732c597e-utilities\") pod \"community-operators-n428h\" (UID: \"ad380640-a5ab-4fb9-838b-28b4732c597e\") " pod="openshift-marketplace/community-operators-n428h" Oct 07 13:11:53 crc kubenswrapper[4677]: I1007 13:11:53.519120 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cmsvj\" (UniqueName: \"kubernetes.io/projected/ad380640-a5ab-4fb9-838b-28b4732c597e-kube-api-access-cmsvj\") pod 
\"community-operators-n428h\" (UID: \"ad380640-a5ab-4fb9-838b-28b4732c597e\") " pod="openshift-marketplace/community-operators-n428h" Oct 07 13:11:53 crc kubenswrapper[4677]: I1007 13:11:53.554954 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-pjgk2"] Oct 07 13:11:53 crc kubenswrapper[4677]: I1007 13:11:53.587866 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-n428h" Oct 07 13:11:53 crc kubenswrapper[4677]: I1007 13:11:53.634403 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pjgk2" event={"ID":"a97dd5c5-2e20-4155-8829-acd24af4fe9f","Type":"ContainerStarted","Data":"c26c2370fd233f393004f72595b2b2876b083656427968324ba893679301f631"} Oct 07 13:11:53 crc kubenswrapper[4677]: I1007 13:11:53.636532 4677 generic.go:334] "Generic (PLEG): container finished" podID="422fbc9b-de01-4f7f-8be7-0036569e6dbb" containerID="968d38ee0d8ec1bee91b882718f6f111f03dec672801e3b3b168a58651e2a971" exitCode=0 Oct 07 13:11:53 crc kubenswrapper[4677]: I1007 13:11:53.636566 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-r7528" event={"ID":"422fbc9b-de01-4f7f-8be7-0036569e6dbb","Type":"ContainerDied","Data":"968d38ee0d8ec1bee91b882718f6f111f03dec672801e3b3b168a58651e2a971"} Oct 07 13:11:53 crc kubenswrapper[4677]: I1007 13:11:53.789135 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-n428h"] Oct 07 13:11:53 crc kubenswrapper[4677]: W1007 13:11:53.798279 4677 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podad380640_a5ab_4fb9_838b_28b4732c597e.slice/crio-68276fe87e1523f77fcb57d67ff67b8055ff01fa3490f65ed1d174ef98575b5f WatchSource:0}: Error finding container 68276fe87e1523f77fcb57d67ff67b8055ff01fa3490f65ed1d174ef98575b5f: Status 404 returned error can't find the container with id 68276fe87e1523f77fcb57d67ff67b8055ff01fa3490f65ed1d174ef98575b5f Oct 07 13:11:54 crc kubenswrapper[4677]: I1007 13:11:54.644626 4677 generic.go:334] "Generic (PLEG): container finished" podID="a97dd5c5-2e20-4155-8829-acd24af4fe9f" containerID="a2641110338619d028504d00c40d4b0261504b860faca642ed117785e9295d46" exitCode=0 Oct 07 13:11:54 crc kubenswrapper[4677]: I1007 13:11:54.644722 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pjgk2" event={"ID":"a97dd5c5-2e20-4155-8829-acd24af4fe9f","Type":"ContainerDied","Data":"a2641110338619d028504d00c40d4b0261504b860faca642ed117785e9295d46"} Oct 07 13:11:54 crc kubenswrapper[4677]: I1007 13:11:54.648133 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-r7528" event={"ID":"422fbc9b-de01-4f7f-8be7-0036569e6dbb","Type":"ContainerStarted","Data":"e1558188865b44ded393a4327b191cece6f6de6ab9f6ee2d8a1ffdb4e425015c"} Oct 07 13:11:54 crc kubenswrapper[4677]: I1007 13:11:54.650099 4677 generic.go:334] "Generic (PLEG): container finished" podID="e628b85d-09f9-4559-9853-b35c19a0e0e6" containerID="4093aef9c4180c39f96ba2ada5bafd85bd12939535adcf66451d8d88d509e04b" exitCode=0 Oct 07 13:11:54 crc kubenswrapper[4677]: I1007 13:11:54.650367 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pm2jz" 
event={"ID":"e628b85d-09f9-4559-9853-b35c19a0e0e6","Type":"ContainerDied","Data":"4093aef9c4180c39f96ba2ada5bafd85bd12939535adcf66451d8d88d509e04b"} Oct 07 13:11:54 crc kubenswrapper[4677]: I1007 13:11:54.652358 4677 generic.go:334] "Generic (PLEG): container finished" podID="ad380640-a5ab-4fb9-838b-28b4732c597e" containerID="83407b93358e150cef70a1b7126ba3ace5e762f0a6e50a143c02f036ab285403" exitCode=0 Oct 07 13:11:54 crc kubenswrapper[4677]: I1007 13:11:54.652398 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n428h" event={"ID":"ad380640-a5ab-4fb9-838b-28b4732c597e","Type":"ContainerDied","Data":"83407b93358e150cef70a1b7126ba3ace5e762f0a6e50a143c02f036ab285403"} Oct 07 13:11:54 crc kubenswrapper[4677]: I1007 13:11:54.652425 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n428h" event={"ID":"ad380640-a5ab-4fb9-838b-28b4732c597e","Type":"ContainerStarted","Data":"68276fe87e1523f77fcb57d67ff67b8055ff01fa3490f65ed1d174ef98575b5f"} Oct 07 13:11:54 crc kubenswrapper[4677]: I1007 13:11:54.717091 4677 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-r7528" podStartSLOduration=2.230761904 podStartE2EDuration="4.717075471s" podCreationTimestamp="2025-10-07 13:11:50 +0000 UTC" firstStartedPulling="2025-10-07 13:11:51.623496204 +0000 UTC m=+283.109205339" lastFinishedPulling="2025-10-07 13:11:54.109809771 +0000 UTC m=+285.595518906" observedRunningTime="2025-10-07 13:11:54.714377143 +0000 UTC m=+286.200086308" watchObservedRunningTime="2025-10-07 13:11:54.717075471 +0000 UTC m=+286.202784586" Oct 07 13:11:55 crc kubenswrapper[4677]: I1007 13:11:55.658756 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n428h" event={"ID":"ad380640-a5ab-4fb9-838b-28b4732c597e","Type":"ContainerStarted","Data":"8fd7a532b73254b2cde5ce2e530d991af282e4b7acac404f46f5318a96c5fdd7"} Oct 07 13:11:55 crc kubenswrapper[4677]: I1007 13:11:55.663047 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pm2jz" event={"ID":"e628b85d-09f9-4559-9853-b35c19a0e0e6","Type":"ContainerStarted","Data":"d35e4f3251d6605ef15c38f325e68ee85415287b9b09b240ed61dc9078cfd807"} Oct 07 13:11:55 crc kubenswrapper[4677]: I1007 13:11:55.699144 4677 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-pm2jz" podStartSLOduration=3.24961828 podStartE2EDuration="5.699125572s" podCreationTimestamp="2025-10-07 13:11:50 +0000 UTC" firstStartedPulling="2025-10-07 13:11:52.631844186 +0000 UTC m=+284.117553341" lastFinishedPulling="2025-10-07 13:11:55.081351518 +0000 UTC m=+286.567060633" observedRunningTime="2025-10-07 13:11:55.697734072 +0000 UTC m=+287.183443187" watchObservedRunningTime="2025-10-07 13:11:55.699125572 +0000 UTC m=+287.184834697" Oct 07 13:11:56 crc kubenswrapper[4677]: I1007 13:11:56.675655 4677 generic.go:334] "Generic (PLEG): container finished" podID="ad380640-a5ab-4fb9-838b-28b4732c597e" containerID="8fd7a532b73254b2cde5ce2e530d991af282e4b7acac404f46f5318a96c5fdd7" exitCode=0 Oct 07 13:11:56 crc kubenswrapper[4677]: I1007 13:11:56.675751 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n428h" event={"ID":"ad380640-a5ab-4fb9-838b-28b4732c597e","Type":"ContainerDied","Data":"8fd7a532b73254b2cde5ce2e530d991af282e4b7acac404f46f5318a96c5fdd7"} Oct 07 13:11:57 crc 
kubenswrapper[4677]: I1007 13:11:57.684862 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n428h" event={"ID":"ad380640-a5ab-4fb9-838b-28b4732c597e","Type":"ContainerStarted","Data":"f1641f8d0e4f314175f728dcd71f21719db17ffe1887899a60b7c65185953e9a"} Oct 07 13:11:57 crc kubenswrapper[4677]: I1007 13:11:57.688008 4677 generic.go:334] "Generic (PLEG): container finished" podID="a97dd5c5-2e20-4155-8829-acd24af4fe9f" containerID="4cc5f9015c41563d4095d0cf15cbeb0cd5f6137253247aad2ea083e6e0cc1b59" exitCode=0 Oct 07 13:11:57 crc kubenswrapper[4677]: I1007 13:11:57.688042 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pjgk2" event={"ID":"a97dd5c5-2e20-4155-8829-acd24af4fe9f","Type":"ContainerDied","Data":"4cc5f9015c41563d4095d0cf15cbeb0cd5f6137253247aad2ea083e6e0cc1b59"} Oct 07 13:11:57 crc kubenswrapper[4677]: I1007 13:11:57.710859 4677 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-n428h" podStartSLOduration=2.068907749 podStartE2EDuration="4.710844583s" podCreationTimestamp="2025-10-07 13:11:53 +0000 UTC" firstStartedPulling="2025-10-07 13:11:54.653700989 +0000 UTC m=+286.139410104" lastFinishedPulling="2025-10-07 13:11:57.295637793 +0000 UTC m=+288.781346938" observedRunningTime="2025-10-07 13:11:57.709145474 +0000 UTC m=+289.194854599" watchObservedRunningTime="2025-10-07 13:11:57.710844583 +0000 UTC m=+289.196553698" Oct 07 13:11:58 crc kubenswrapper[4677]: I1007 13:11:58.696292 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pjgk2" event={"ID":"a97dd5c5-2e20-4155-8829-acd24af4fe9f","Type":"ContainerStarted","Data":"40eb6b1ffe261cc3e570905c9ead196557bdb15102db161c15f0190c403b1c24"} Oct 07 13:11:58 crc kubenswrapper[4677]: I1007 13:11:58.714219 4677 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-pjgk2" podStartSLOduration=1.896351152 podStartE2EDuration="5.71420194s" podCreationTimestamp="2025-10-07 13:11:53 +0000 UTC" firstStartedPulling="2025-10-07 13:11:54.647073068 +0000 UTC m=+286.132782173" lastFinishedPulling="2025-10-07 13:11:58.464923846 +0000 UTC m=+289.950632961" observedRunningTime="2025-10-07 13:11:58.709490133 +0000 UTC m=+290.195199258" watchObservedRunningTime="2025-10-07 13:11:58.71420194 +0000 UTC m=+290.199911065" Oct 07 13:12:00 crc kubenswrapper[4677]: I1007 13:12:00.981719 4677 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-r7528" Oct 07 13:12:00 crc kubenswrapper[4677]: I1007 13:12:00.981796 4677 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-r7528" Oct 07 13:12:01 crc kubenswrapper[4677]: I1007 13:12:01.046820 4677 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-r7528" Oct 07 13:12:01 crc kubenswrapper[4677]: I1007 13:12:01.195529 4677 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-pm2jz" Oct 07 13:12:01 crc kubenswrapper[4677]: I1007 13:12:01.195583 4677 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-pm2jz" Oct 07 13:12:01 crc kubenswrapper[4677]: I1007 13:12:01.237199 4677 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/redhat-operators-pm2jz" Oct 07 13:12:01 crc kubenswrapper[4677]: I1007 13:12:01.747812 4677 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-pm2jz" Oct 07 13:12:01 crc kubenswrapper[4677]: I1007 13:12:01.775176 4677 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-r7528" Oct 07 13:12:03 crc kubenswrapper[4677]: I1007 13:12:03.370783 4677 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-pjgk2" Oct 07 13:12:03 crc kubenswrapper[4677]: I1007 13:12:03.371058 4677 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-pjgk2" Oct 07 13:12:03 crc kubenswrapper[4677]: I1007 13:12:03.406037 4677 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-pjgk2" Oct 07 13:12:03 crc kubenswrapper[4677]: I1007 13:12:03.588671 4677 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-n428h" Oct 07 13:12:03 crc kubenswrapper[4677]: I1007 13:12:03.588787 4677 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-n428h" Oct 07 13:12:03 crc kubenswrapper[4677]: I1007 13:12:03.628500 4677 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-n428h" Oct 07 13:12:03 crc kubenswrapper[4677]: I1007 13:12:03.762593 4677 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-pjgk2" Oct 07 13:12:03 crc kubenswrapper[4677]: I1007 13:12:03.767347 4677 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-n428h" Oct 07 13:13:10 crc kubenswrapper[4677]: I1007 13:13:10.918500 4677 patch_prober.go:28] interesting pod/machine-config-daemon-r7cnz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 13:13:10 crc kubenswrapper[4677]: I1007 13:13:10.919219 4677 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-r7cnz" podUID="7879fa59-a7cb-4d29-ba3a-c91f43bfcba6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 13:13:40 crc kubenswrapper[4677]: I1007 13:13:40.917258 4677 patch_prober.go:28] interesting pod/machine-config-daemon-r7cnz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 13:13:40 crc kubenswrapper[4677]: I1007 13:13:40.918112 4677 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-r7cnz" podUID="7879fa59-a7cb-4d29-ba3a-c91f43bfcba6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 13:14:02 crc kubenswrapper[4677]: I1007 13:14:02.233207 4677 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-6vg95"] Oct 07 13:14:02 crc kubenswrapper[4677]: I1007 13:14:02.234379 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-6vg95" Oct 07 13:14:02 crc kubenswrapper[4677]: I1007 13:14:02.251919 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-6vg95"] Oct 07 13:14:02 crc kubenswrapper[4677]: I1007 13:14:02.396510 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f2728adb-dce2-49bc-8763-00ce83fbbf33-trusted-ca\") pod \"image-registry-66df7c8f76-6vg95\" (UID: \"f2728adb-dce2-49bc-8763-00ce83fbbf33\") " pod="openshift-image-registry/image-registry-66df7c8f76-6vg95" Oct 07 13:14:02 crc kubenswrapper[4677]: I1007 13:14:02.396553 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f2728adb-dce2-49bc-8763-00ce83fbbf33-registry-certificates\") pod \"image-registry-66df7c8f76-6vg95\" (UID: \"f2728adb-dce2-49bc-8763-00ce83fbbf33\") " pod="openshift-image-registry/image-registry-66df7c8f76-6vg95" Oct 07 13:14:02 crc kubenswrapper[4677]: I1007 13:14:02.396578 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4gd8z\" (UniqueName: \"kubernetes.io/projected/f2728adb-dce2-49bc-8763-00ce83fbbf33-kube-api-access-4gd8z\") pod \"image-registry-66df7c8f76-6vg95\" (UID: \"f2728adb-dce2-49bc-8763-00ce83fbbf33\") " pod="openshift-image-registry/image-registry-66df7c8f76-6vg95" Oct 07 13:14:02 crc kubenswrapper[4677]: I1007 13:14:02.396617 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f2728adb-dce2-49bc-8763-00ce83fbbf33-bound-sa-token\") pod \"image-registry-66df7c8f76-6vg95\" (UID: \"f2728adb-dce2-49bc-8763-00ce83fbbf33\") " pod="openshift-image-registry/image-registry-66df7c8f76-6vg95" Oct 07 13:14:02 crc kubenswrapper[4677]: I1007 13:14:02.396647 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-6vg95\" (UID: \"f2728adb-dce2-49bc-8763-00ce83fbbf33\") " pod="openshift-image-registry/image-registry-66df7c8f76-6vg95" Oct 07 13:14:02 crc kubenswrapper[4677]: I1007 13:14:02.396692 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f2728adb-dce2-49bc-8763-00ce83fbbf33-registry-tls\") pod \"image-registry-66df7c8f76-6vg95\" (UID: \"f2728adb-dce2-49bc-8763-00ce83fbbf33\") " pod="openshift-image-registry/image-registry-66df7c8f76-6vg95" Oct 07 13:14:02 crc kubenswrapper[4677]: I1007 13:14:02.396713 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f2728adb-dce2-49bc-8763-00ce83fbbf33-installation-pull-secrets\") pod \"image-registry-66df7c8f76-6vg95\" (UID: \"f2728adb-dce2-49bc-8763-00ce83fbbf33\") " pod="openshift-image-registry/image-registry-66df7c8f76-6vg95" Oct 07 13:14:02 crc 
kubenswrapper[4677]: I1007 13:14:02.396742 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f2728adb-dce2-49bc-8763-00ce83fbbf33-ca-trust-extracted\") pod \"image-registry-66df7c8f76-6vg95\" (UID: \"f2728adb-dce2-49bc-8763-00ce83fbbf33\") " pod="openshift-image-registry/image-registry-66df7c8f76-6vg95" Oct 07 13:14:02 crc kubenswrapper[4677]: I1007 13:14:02.418122 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-6vg95\" (UID: \"f2728adb-dce2-49bc-8763-00ce83fbbf33\") " pod="openshift-image-registry/image-registry-66df7c8f76-6vg95" Oct 07 13:14:02 crc kubenswrapper[4677]: I1007 13:14:02.498227 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f2728adb-dce2-49bc-8763-00ce83fbbf33-bound-sa-token\") pod \"image-registry-66df7c8f76-6vg95\" (UID: \"f2728adb-dce2-49bc-8763-00ce83fbbf33\") " pod="openshift-image-registry/image-registry-66df7c8f76-6vg95" Oct 07 13:14:02 crc kubenswrapper[4677]: I1007 13:14:02.498753 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f2728adb-dce2-49bc-8763-00ce83fbbf33-installation-pull-secrets\") pod \"image-registry-66df7c8f76-6vg95\" (UID: \"f2728adb-dce2-49bc-8763-00ce83fbbf33\") " pod="openshift-image-registry/image-registry-66df7c8f76-6vg95" Oct 07 13:14:02 crc kubenswrapper[4677]: I1007 13:14:02.498989 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f2728adb-dce2-49bc-8763-00ce83fbbf33-registry-tls\") pod \"image-registry-66df7c8f76-6vg95\" (UID: \"f2728adb-dce2-49bc-8763-00ce83fbbf33\") " pod="openshift-image-registry/image-registry-66df7c8f76-6vg95" Oct 07 13:14:02 crc kubenswrapper[4677]: I1007 13:14:02.499210 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f2728adb-dce2-49bc-8763-00ce83fbbf33-ca-trust-extracted\") pod \"image-registry-66df7c8f76-6vg95\" (UID: \"f2728adb-dce2-49bc-8763-00ce83fbbf33\") " pod="openshift-image-registry/image-registry-66df7c8f76-6vg95" Oct 07 13:14:02 crc kubenswrapper[4677]: I1007 13:14:02.499530 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f2728adb-dce2-49bc-8763-00ce83fbbf33-trusted-ca\") pod \"image-registry-66df7c8f76-6vg95\" (UID: \"f2728adb-dce2-49bc-8763-00ce83fbbf33\") " pod="openshift-image-registry/image-registry-66df7c8f76-6vg95" Oct 07 13:14:02 crc kubenswrapper[4677]: I1007 13:14:02.499776 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f2728adb-dce2-49bc-8763-00ce83fbbf33-registry-certificates\") pod \"image-registry-66df7c8f76-6vg95\" (UID: \"f2728adb-dce2-49bc-8763-00ce83fbbf33\") " pod="openshift-image-registry/image-registry-66df7c8f76-6vg95" Oct 07 13:14:02 crc kubenswrapper[4677]: I1007 13:14:02.499987 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4gd8z\" (UniqueName: 
\"kubernetes.io/projected/f2728adb-dce2-49bc-8763-00ce83fbbf33-kube-api-access-4gd8z\") pod \"image-registry-66df7c8f76-6vg95\" (UID: \"f2728adb-dce2-49bc-8763-00ce83fbbf33\") " pod="openshift-image-registry/image-registry-66df7c8f76-6vg95" Oct 07 13:14:02 crc kubenswrapper[4677]: I1007 13:14:02.499690 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f2728adb-dce2-49bc-8763-00ce83fbbf33-ca-trust-extracted\") pod \"image-registry-66df7c8f76-6vg95\" (UID: \"f2728adb-dce2-49bc-8763-00ce83fbbf33\") " pod="openshift-image-registry/image-registry-66df7c8f76-6vg95" Oct 07 13:14:02 crc kubenswrapper[4677]: I1007 13:14:02.500757 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f2728adb-dce2-49bc-8763-00ce83fbbf33-registry-certificates\") pod \"image-registry-66df7c8f76-6vg95\" (UID: \"f2728adb-dce2-49bc-8763-00ce83fbbf33\") " pod="openshift-image-registry/image-registry-66df7c8f76-6vg95" Oct 07 13:14:02 crc kubenswrapper[4677]: I1007 13:14:02.502828 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f2728adb-dce2-49bc-8763-00ce83fbbf33-trusted-ca\") pod \"image-registry-66df7c8f76-6vg95\" (UID: \"f2728adb-dce2-49bc-8763-00ce83fbbf33\") " pod="openshift-image-registry/image-registry-66df7c8f76-6vg95" Oct 07 13:14:02 crc kubenswrapper[4677]: I1007 13:14:02.507259 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f2728adb-dce2-49bc-8763-00ce83fbbf33-installation-pull-secrets\") pod \"image-registry-66df7c8f76-6vg95\" (UID: \"f2728adb-dce2-49bc-8763-00ce83fbbf33\") " pod="openshift-image-registry/image-registry-66df7c8f76-6vg95" Oct 07 13:14:02 crc kubenswrapper[4677]: I1007 13:14:02.507577 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f2728adb-dce2-49bc-8763-00ce83fbbf33-registry-tls\") pod \"image-registry-66df7c8f76-6vg95\" (UID: \"f2728adb-dce2-49bc-8763-00ce83fbbf33\") " pod="openshift-image-registry/image-registry-66df7c8f76-6vg95" Oct 07 13:14:02 crc kubenswrapper[4677]: I1007 13:14:02.530283 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f2728adb-dce2-49bc-8763-00ce83fbbf33-bound-sa-token\") pod \"image-registry-66df7c8f76-6vg95\" (UID: \"f2728adb-dce2-49bc-8763-00ce83fbbf33\") " pod="openshift-image-registry/image-registry-66df7c8f76-6vg95" Oct 07 13:14:02 crc kubenswrapper[4677]: I1007 13:14:02.536880 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4gd8z\" (UniqueName: \"kubernetes.io/projected/f2728adb-dce2-49bc-8763-00ce83fbbf33-kube-api-access-4gd8z\") pod \"image-registry-66df7c8f76-6vg95\" (UID: \"f2728adb-dce2-49bc-8763-00ce83fbbf33\") " pod="openshift-image-registry/image-registry-66df7c8f76-6vg95" Oct 07 13:14:02 crc kubenswrapper[4677]: I1007 13:14:02.555094 4677 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-6vg95" Oct 07 13:14:02 crc kubenswrapper[4677]: I1007 13:14:02.824020 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-6vg95"] Oct 07 13:14:03 crc kubenswrapper[4677]: I1007 13:14:03.493997 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-6vg95" event={"ID":"f2728adb-dce2-49bc-8763-00ce83fbbf33","Type":"ContainerStarted","Data":"3b0271c48abeee3c2c6bfb65cda988150081ea06b5ae85c647daa8722e635ef6"} Oct 07 13:14:03 crc kubenswrapper[4677]: I1007 13:14:03.494057 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-6vg95" event={"ID":"f2728adb-dce2-49bc-8763-00ce83fbbf33","Type":"ContainerStarted","Data":"7783a1880a6fa85d1a02aa10795d13a6d6531796afb25181f9008a9f46bd2bb3"} Oct 07 13:14:03 crc kubenswrapper[4677]: I1007 13:14:03.494097 4677 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-6vg95" Oct 07 13:14:10 crc kubenswrapper[4677]: I1007 13:14:10.917373 4677 patch_prober.go:28] interesting pod/machine-config-daemon-r7cnz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 13:14:10 crc kubenswrapper[4677]: I1007 13:14:10.918153 4677 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-r7cnz" podUID="7879fa59-a7cb-4d29-ba3a-c91f43bfcba6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 13:14:10 crc kubenswrapper[4677]: I1007 13:14:10.918223 4677 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-r7cnz" Oct 07 13:14:10 crc kubenswrapper[4677]: I1007 13:14:10.919082 4677 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"82a5b7d40ad019c3c617ec7d72d51f1fbb5c958d11768560cf6d8828b0539b5b"} pod="openshift-machine-config-operator/machine-config-daemon-r7cnz" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 07 13:14:10 crc kubenswrapper[4677]: I1007 13:14:10.919174 4677 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-r7cnz" podUID="7879fa59-a7cb-4d29-ba3a-c91f43bfcba6" containerName="machine-config-daemon" containerID="cri-o://82a5b7d40ad019c3c617ec7d72d51f1fbb5c958d11768560cf6d8828b0539b5b" gracePeriod=600 Oct 07 13:14:11 crc kubenswrapper[4677]: I1007 13:14:11.547932 4677 generic.go:334] "Generic (PLEG): container finished" podID="7879fa59-a7cb-4d29-ba3a-c91f43bfcba6" containerID="82a5b7d40ad019c3c617ec7d72d51f1fbb5c958d11768560cf6d8828b0539b5b" exitCode=0 Oct 07 13:14:11 crc kubenswrapper[4677]: I1007 13:14:11.548008 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-r7cnz" event={"ID":"7879fa59-a7cb-4d29-ba3a-c91f43bfcba6","Type":"ContainerDied","Data":"82a5b7d40ad019c3c617ec7d72d51f1fbb5c958d11768560cf6d8828b0539b5b"} Oct 07 13:14:11 crc kubenswrapper[4677]: I1007 
13:14:11.548646 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-r7cnz" event={"ID":"7879fa59-a7cb-4d29-ba3a-c91f43bfcba6","Type":"ContainerStarted","Data":"75d4db8c22e96ea7fcbf447dc088ac317cb51f5a548c1df77f076e3a1152231a"} Oct 07 13:14:11 crc kubenswrapper[4677]: I1007 13:14:11.548693 4677 scope.go:117] "RemoveContainer" containerID="5d3e4ef8267212ad1faf24bfcb3b6f633a283684ba587e304e94d434bd9a2618" Oct 07 13:14:11 crc kubenswrapper[4677]: I1007 13:14:11.567261 4677 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-6vg95" podStartSLOduration=9.567244702 podStartE2EDuration="9.567244702s" podCreationTimestamp="2025-10-07 13:14:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:14:03.519170415 +0000 UTC m=+415.004879560" watchObservedRunningTime="2025-10-07 13:14:11.567244702 +0000 UTC m=+423.052953807" Oct 07 13:14:22 crc kubenswrapper[4677]: I1007 13:14:22.562597 4677 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-6vg95" Oct 07 13:14:22 crc kubenswrapper[4677]: I1007 13:14:22.640861 4677 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-bldn4"] Oct 07 13:14:47 crc kubenswrapper[4677]: I1007 13:14:47.692796 4677 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-bldn4" podUID="51dd4275-14c4-459b-a065-46ae2b4fd741" containerName="registry" containerID="cri-o://81d2ac496dadeb51e9fb436d0d472e486d28871dfdba9827e72fbcf63f87ca24" gracePeriod=30 Oct 07 13:14:47 crc kubenswrapper[4677]: I1007 13:14:47.890856 4677 patch_prober.go:28] interesting pod/image-registry-697d97f7c8-bldn4 container/registry namespace/openshift-image-registry: Readiness probe status=failure output="Get \"https://10.217.0.13:5000/healthz\": dial tcp 10.217.0.13:5000: connect: connection refused" start-of-body= Oct 07 13:14:47 crc kubenswrapper[4677]: I1007 13:14:47.890951 4677 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-697d97f7c8-bldn4" podUID="51dd4275-14c4-459b-a065-46ae2b4fd741" containerName="registry" probeResult="failure" output="Get \"https://10.217.0.13:5000/healthz\": dial tcp 10.217.0.13:5000: connect: connection refused" Oct 07 13:14:48 crc kubenswrapper[4677]: I1007 13:14:48.077348 4677 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-bldn4" Oct 07 13:14:48 crc kubenswrapper[4677]: I1007 13:14:48.103200 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/51dd4275-14c4-459b-a065-46ae2b4fd741-bound-sa-token\") pod \"51dd4275-14c4-459b-a065-46ae2b4fd741\" (UID: \"51dd4275-14c4-459b-a065-46ae2b4fd741\") " Oct 07 13:14:48 crc kubenswrapper[4677]: I1007 13:14:48.103635 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/51dd4275-14c4-459b-a065-46ae2b4fd741-registry-certificates\") pod \"51dd4275-14c4-459b-a065-46ae2b4fd741\" (UID: \"51dd4275-14c4-459b-a065-46ae2b4fd741\") " Oct 07 13:14:48 crc kubenswrapper[4677]: I1007 13:14:48.103681 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/51dd4275-14c4-459b-a065-46ae2b4fd741-trusted-ca\") pod \"51dd4275-14c4-459b-a065-46ae2b4fd741\" (UID: \"51dd4275-14c4-459b-a065-46ae2b4fd741\") " Oct 07 13:14:48 crc kubenswrapper[4677]: I1007 13:14:48.103798 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"51dd4275-14c4-459b-a065-46ae2b4fd741\" (UID: \"51dd4275-14c4-459b-a065-46ae2b4fd741\") " Oct 07 13:14:48 crc kubenswrapper[4677]: I1007 13:14:48.103830 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t9vbb\" (UniqueName: \"kubernetes.io/projected/51dd4275-14c4-459b-a065-46ae2b4fd741-kube-api-access-t9vbb\") pod \"51dd4275-14c4-459b-a065-46ae2b4fd741\" (UID: \"51dd4275-14c4-459b-a065-46ae2b4fd741\") " Oct 07 13:14:48 crc kubenswrapper[4677]: I1007 13:14:48.105256 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/51dd4275-14c4-459b-a065-46ae2b4fd741-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "51dd4275-14c4-459b-a065-46ae2b4fd741" (UID: "51dd4275-14c4-459b-a065-46ae2b4fd741"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:14:48 crc kubenswrapper[4677]: I1007 13:14:48.105471 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/51dd4275-14c4-459b-a065-46ae2b4fd741-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "51dd4275-14c4-459b-a065-46ae2b4fd741" (UID: "51dd4275-14c4-459b-a065-46ae2b4fd741"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:14:48 crc kubenswrapper[4677]: I1007 13:14:48.138124 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51dd4275-14c4-459b-a065-46ae2b4fd741-kube-api-access-t9vbb" (OuterVolumeSpecName: "kube-api-access-t9vbb") pod "51dd4275-14c4-459b-a065-46ae2b4fd741" (UID: "51dd4275-14c4-459b-a065-46ae2b4fd741"). InnerVolumeSpecName "kube-api-access-t9vbb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:14:48 crc kubenswrapper[4677]: I1007 13:14:48.138328 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "51dd4275-14c4-459b-a065-46ae2b4fd741" (UID: "51dd4275-14c4-459b-a065-46ae2b4fd741"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Oct 07 13:14:48 crc kubenswrapper[4677]: I1007 13:14:48.138668 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51dd4275-14c4-459b-a065-46ae2b4fd741-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "51dd4275-14c4-459b-a065-46ae2b4fd741" (UID: "51dd4275-14c4-459b-a065-46ae2b4fd741"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:14:48 crc kubenswrapper[4677]: I1007 13:14:48.204303 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/51dd4275-14c4-459b-a065-46ae2b4fd741-installation-pull-secrets\") pod \"51dd4275-14c4-459b-a065-46ae2b4fd741\" (UID: \"51dd4275-14c4-459b-a065-46ae2b4fd741\") " Oct 07 13:14:48 crc kubenswrapper[4677]: I1007 13:14:48.204367 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/51dd4275-14c4-459b-a065-46ae2b4fd741-registry-tls\") pod \"51dd4275-14c4-459b-a065-46ae2b4fd741\" (UID: \"51dd4275-14c4-459b-a065-46ae2b4fd741\") " Oct 07 13:14:48 crc kubenswrapper[4677]: I1007 13:14:48.204393 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/51dd4275-14c4-459b-a065-46ae2b4fd741-ca-trust-extracted\") pod \"51dd4275-14c4-459b-a065-46ae2b4fd741\" (UID: \"51dd4275-14c4-459b-a065-46ae2b4fd741\") " Oct 07 13:14:48 crc kubenswrapper[4677]: I1007 13:14:48.204545 4677 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/51dd4275-14c4-459b-a065-46ae2b4fd741-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 07 13:14:48 crc kubenswrapper[4677]: I1007 13:14:48.204557 4677 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/51dd4275-14c4-459b-a065-46ae2b4fd741-registry-certificates\") on node \"crc\" DevicePath \"\"" Oct 07 13:14:48 crc kubenswrapper[4677]: I1007 13:14:48.204567 4677 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/51dd4275-14c4-459b-a065-46ae2b4fd741-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 07 13:14:48 crc kubenswrapper[4677]: I1007 13:14:48.204578 4677 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t9vbb\" (UniqueName: \"kubernetes.io/projected/51dd4275-14c4-459b-a065-46ae2b4fd741-kube-api-access-t9vbb\") on node \"crc\" DevicePath \"\"" Oct 07 13:14:48 crc kubenswrapper[4677]: I1007 13:14:48.207648 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51dd4275-14c4-459b-a065-46ae2b4fd741-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "51dd4275-14c4-459b-a065-46ae2b4fd741" (UID: "51dd4275-14c4-459b-a065-46ae2b4fd741"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:14:48 crc kubenswrapper[4677]: I1007 13:14:48.210284 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51dd4275-14c4-459b-a065-46ae2b4fd741-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "51dd4275-14c4-459b-a065-46ae2b4fd741" (UID: "51dd4275-14c4-459b-a065-46ae2b4fd741"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:14:48 crc kubenswrapper[4677]: I1007 13:14:48.232259 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/51dd4275-14c4-459b-a065-46ae2b4fd741-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "51dd4275-14c4-459b-a065-46ae2b4fd741" (UID: "51dd4275-14c4-459b-a065-46ae2b4fd741"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:14:48 crc kubenswrapper[4677]: I1007 13:14:48.305873 4677 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/51dd4275-14c4-459b-a065-46ae2b4fd741-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Oct 07 13:14:48 crc kubenswrapper[4677]: I1007 13:14:48.305928 4677 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/51dd4275-14c4-459b-a065-46ae2b4fd741-registry-tls\") on node \"crc\" DevicePath \"\"" Oct 07 13:14:48 crc kubenswrapper[4677]: I1007 13:14:48.305950 4677 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/51dd4275-14c4-459b-a065-46ae2b4fd741-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Oct 07 13:14:48 crc kubenswrapper[4677]: I1007 13:14:48.811705 4677 generic.go:334] "Generic (PLEG): container finished" podID="51dd4275-14c4-459b-a065-46ae2b4fd741" containerID="81d2ac496dadeb51e9fb436d0d472e486d28871dfdba9827e72fbcf63f87ca24" exitCode=0 Oct 07 13:14:48 crc kubenswrapper[4677]: I1007 13:14:48.811788 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-bldn4" event={"ID":"51dd4275-14c4-459b-a065-46ae2b4fd741","Type":"ContainerDied","Data":"81d2ac496dadeb51e9fb436d0d472e486d28871dfdba9827e72fbcf63f87ca24"} Oct 07 13:14:48 crc kubenswrapper[4677]: I1007 13:14:48.811839 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-bldn4" event={"ID":"51dd4275-14c4-459b-a065-46ae2b4fd741","Type":"ContainerDied","Data":"3681657c57c95ca89d6ca84dcad897a7c45f44f2efe24da666bd9ff2b8f6f9b1"} Oct 07 13:14:48 crc kubenswrapper[4677]: I1007 13:14:48.811907 4677 scope.go:117] "RemoveContainer" containerID="81d2ac496dadeb51e9fb436d0d472e486d28871dfdba9827e72fbcf63f87ca24" Oct 07 13:14:48 crc kubenswrapper[4677]: I1007 13:14:48.812206 4677 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-bldn4" Oct 07 13:14:48 crc kubenswrapper[4677]: I1007 13:14:48.849197 4677 scope.go:117] "RemoveContainer" containerID="81d2ac496dadeb51e9fb436d0d472e486d28871dfdba9827e72fbcf63f87ca24" Oct 07 13:14:48 crc kubenswrapper[4677]: E1007 13:14:48.849906 4677 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"81d2ac496dadeb51e9fb436d0d472e486d28871dfdba9827e72fbcf63f87ca24\": container with ID starting with 81d2ac496dadeb51e9fb436d0d472e486d28871dfdba9827e72fbcf63f87ca24 not found: ID does not exist" containerID="81d2ac496dadeb51e9fb436d0d472e486d28871dfdba9827e72fbcf63f87ca24" Oct 07 13:14:48 crc kubenswrapper[4677]: I1007 13:14:48.849988 4677 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"81d2ac496dadeb51e9fb436d0d472e486d28871dfdba9827e72fbcf63f87ca24"} err="failed to get container status \"81d2ac496dadeb51e9fb436d0d472e486d28871dfdba9827e72fbcf63f87ca24\": rpc error: code = NotFound desc = could not find container \"81d2ac496dadeb51e9fb436d0d472e486d28871dfdba9827e72fbcf63f87ca24\": container with ID starting with 81d2ac496dadeb51e9fb436d0d472e486d28871dfdba9827e72fbcf63f87ca24 not found: ID does not exist" Oct 07 13:14:48 crc kubenswrapper[4677]: I1007 13:14:48.867366 4677 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-bldn4"] Oct 07 13:14:48 crc kubenswrapper[4677]: I1007 13:14:48.877549 4677 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-bldn4"] Oct 07 13:14:49 crc kubenswrapper[4677]: I1007 13:14:49.315471 4677 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="51dd4275-14c4-459b-a065-46ae2b4fd741" path="/var/lib/kubelet/pods/51dd4275-14c4-459b-a065-46ae2b4fd741/volumes" Oct 07 13:15:00 crc kubenswrapper[4677]: I1007 13:15:00.161878 4677 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330715-rmk7f"] Oct 07 13:15:00 crc kubenswrapper[4677]: E1007 13:15:00.162659 4677 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51dd4275-14c4-459b-a065-46ae2b4fd741" containerName="registry" Oct 07 13:15:00 crc kubenswrapper[4677]: I1007 13:15:00.162687 4677 state_mem.go:107] "Deleted CPUSet assignment" podUID="51dd4275-14c4-459b-a065-46ae2b4fd741" containerName="registry" Oct 07 13:15:00 crc kubenswrapper[4677]: I1007 13:15:00.162881 4677 memory_manager.go:354] "RemoveStaleState removing state" podUID="51dd4275-14c4-459b-a065-46ae2b4fd741" containerName="registry" Oct 07 13:15:00 crc kubenswrapper[4677]: I1007 13:15:00.163629 4677 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330715-rmk7f" Oct 07 13:15:00 crc kubenswrapper[4677]: I1007 13:15:00.168341 4677 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 07 13:15:00 crc kubenswrapper[4677]: I1007 13:15:00.169179 4677 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 07 13:15:00 crc kubenswrapper[4677]: I1007 13:15:00.174206 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330715-rmk7f"] Oct 07 13:15:00 crc kubenswrapper[4677]: I1007 13:15:00.276068 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z75jb\" (UniqueName: \"kubernetes.io/projected/bb3cbf3e-6351-4579-8180-9880390f5246-kube-api-access-z75jb\") pod \"collect-profiles-29330715-rmk7f\" (UID: \"bb3cbf3e-6351-4579-8180-9880390f5246\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330715-rmk7f" Oct 07 13:15:00 crc kubenswrapper[4677]: I1007 13:15:00.276123 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bb3cbf3e-6351-4579-8180-9880390f5246-secret-volume\") pod \"collect-profiles-29330715-rmk7f\" (UID: \"bb3cbf3e-6351-4579-8180-9880390f5246\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330715-rmk7f" Oct 07 13:15:00 crc kubenswrapper[4677]: I1007 13:15:00.276172 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bb3cbf3e-6351-4579-8180-9880390f5246-config-volume\") pod \"collect-profiles-29330715-rmk7f\" (UID: \"bb3cbf3e-6351-4579-8180-9880390f5246\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330715-rmk7f" Oct 07 13:15:00 crc kubenswrapper[4677]: I1007 13:15:00.377796 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bb3cbf3e-6351-4579-8180-9880390f5246-config-volume\") pod \"collect-profiles-29330715-rmk7f\" (UID: \"bb3cbf3e-6351-4579-8180-9880390f5246\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330715-rmk7f" Oct 07 13:15:00 crc kubenswrapper[4677]: I1007 13:15:00.377928 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z75jb\" (UniqueName: \"kubernetes.io/projected/bb3cbf3e-6351-4579-8180-9880390f5246-kube-api-access-z75jb\") pod \"collect-profiles-29330715-rmk7f\" (UID: \"bb3cbf3e-6351-4579-8180-9880390f5246\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330715-rmk7f" Oct 07 13:15:00 crc kubenswrapper[4677]: I1007 13:15:00.378023 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bb3cbf3e-6351-4579-8180-9880390f5246-secret-volume\") pod \"collect-profiles-29330715-rmk7f\" (UID: \"bb3cbf3e-6351-4579-8180-9880390f5246\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330715-rmk7f" Oct 07 13:15:00 crc kubenswrapper[4677]: I1007 13:15:00.380369 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bb3cbf3e-6351-4579-8180-9880390f5246-config-volume\") pod 
\"collect-profiles-29330715-rmk7f\" (UID: \"bb3cbf3e-6351-4579-8180-9880390f5246\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330715-rmk7f" Oct 07 13:15:00 crc kubenswrapper[4677]: I1007 13:15:00.384725 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bb3cbf3e-6351-4579-8180-9880390f5246-secret-volume\") pod \"collect-profiles-29330715-rmk7f\" (UID: \"bb3cbf3e-6351-4579-8180-9880390f5246\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330715-rmk7f" Oct 07 13:15:00 crc kubenswrapper[4677]: I1007 13:15:00.408181 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z75jb\" (UniqueName: \"kubernetes.io/projected/bb3cbf3e-6351-4579-8180-9880390f5246-kube-api-access-z75jb\") pod \"collect-profiles-29330715-rmk7f\" (UID: \"bb3cbf3e-6351-4579-8180-9880390f5246\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330715-rmk7f" Oct 07 13:15:00 crc kubenswrapper[4677]: I1007 13:15:00.516456 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330715-rmk7f" Oct 07 13:15:00 crc kubenswrapper[4677]: I1007 13:15:00.782327 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330715-rmk7f"] Oct 07 13:15:00 crc kubenswrapper[4677]: I1007 13:15:00.903855 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29330715-rmk7f" event={"ID":"bb3cbf3e-6351-4579-8180-9880390f5246","Type":"ContainerStarted","Data":"836ade1aecc9fb676e69829258b2599ccc2b9e39791402f115d3fbc1718ed565"} Oct 07 13:15:01 crc kubenswrapper[4677]: I1007 13:15:01.912992 4677 generic.go:334] "Generic (PLEG): container finished" podID="bb3cbf3e-6351-4579-8180-9880390f5246" containerID="ee063b47652695b90775a5afa8558e74c1a080aeaf50f65fb028b57b98c1efdd" exitCode=0 Oct 07 13:15:01 crc kubenswrapper[4677]: I1007 13:15:01.913078 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29330715-rmk7f" event={"ID":"bb3cbf3e-6351-4579-8180-9880390f5246","Type":"ContainerDied","Data":"ee063b47652695b90775a5afa8558e74c1a080aeaf50f65fb028b57b98c1efdd"} Oct 07 13:15:03 crc kubenswrapper[4677]: I1007 13:15:03.222848 4677 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330715-rmk7f" Oct 07 13:15:03 crc kubenswrapper[4677]: I1007 13:15:03.321658 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bb3cbf3e-6351-4579-8180-9880390f5246-config-volume\") pod \"bb3cbf3e-6351-4579-8180-9880390f5246\" (UID: \"bb3cbf3e-6351-4579-8180-9880390f5246\") " Oct 07 13:15:03 crc kubenswrapper[4677]: I1007 13:15:03.321729 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z75jb\" (UniqueName: \"kubernetes.io/projected/bb3cbf3e-6351-4579-8180-9880390f5246-kube-api-access-z75jb\") pod \"bb3cbf3e-6351-4579-8180-9880390f5246\" (UID: \"bb3cbf3e-6351-4579-8180-9880390f5246\") " Oct 07 13:15:03 crc kubenswrapper[4677]: I1007 13:15:03.321783 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bb3cbf3e-6351-4579-8180-9880390f5246-secret-volume\") pod \"bb3cbf3e-6351-4579-8180-9880390f5246\" (UID: \"bb3cbf3e-6351-4579-8180-9880390f5246\") " Oct 07 13:15:03 crc kubenswrapper[4677]: I1007 13:15:03.322603 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bb3cbf3e-6351-4579-8180-9880390f5246-config-volume" (OuterVolumeSpecName: "config-volume") pod "bb3cbf3e-6351-4579-8180-9880390f5246" (UID: "bb3cbf3e-6351-4579-8180-9880390f5246"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:15:03 crc kubenswrapper[4677]: I1007 13:15:03.329773 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb3cbf3e-6351-4579-8180-9880390f5246-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "bb3cbf3e-6351-4579-8180-9880390f5246" (UID: "bb3cbf3e-6351-4579-8180-9880390f5246"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:15:03 crc kubenswrapper[4677]: I1007 13:15:03.329847 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb3cbf3e-6351-4579-8180-9880390f5246-kube-api-access-z75jb" (OuterVolumeSpecName: "kube-api-access-z75jb") pod "bb3cbf3e-6351-4579-8180-9880390f5246" (UID: "bb3cbf3e-6351-4579-8180-9880390f5246"). InnerVolumeSpecName "kube-api-access-z75jb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:15:03 crc kubenswrapper[4677]: I1007 13:15:03.423284 4677 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bb3cbf3e-6351-4579-8180-9880390f5246-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 07 13:15:03 crc kubenswrapper[4677]: I1007 13:15:03.423347 4677 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bb3cbf3e-6351-4579-8180-9880390f5246-config-volume\") on node \"crc\" DevicePath \"\"" Oct 07 13:15:03 crc kubenswrapper[4677]: I1007 13:15:03.423371 4677 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z75jb\" (UniqueName: \"kubernetes.io/projected/bb3cbf3e-6351-4579-8180-9880390f5246-kube-api-access-z75jb\") on node \"crc\" DevicePath \"\"" Oct 07 13:15:03 crc kubenswrapper[4677]: I1007 13:15:03.934621 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29330715-rmk7f" event={"ID":"bb3cbf3e-6351-4579-8180-9880390f5246","Type":"ContainerDied","Data":"836ade1aecc9fb676e69829258b2599ccc2b9e39791402f115d3fbc1718ed565"} Oct 07 13:15:03 crc kubenswrapper[4677]: I1007 13:15:03.934753 4677 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="836ade1aecc9fb676e69829258b2599ccc2b9e39791402f115d3fbc1718ed565" Oct 07 13:15:03 crc kubenswrapper[4677]: I1007 13:15:03.934763 4677 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330715-rmk7f" Oct 07 13:16:09 crc kubenswrapper[4677]: I1007 13:16:09.508729 4677 scope.go:117] "RemoveContainer" containerID="da5f756ed2ce40fda4eaff3fc908b803777660990da37cf7bb6c8460c2373e85" Oct 07 13:16:40 crc kubenswrapper[4677]: I1007 13:16:40.917482 4677 patch_prober.go:28] interesting pod/machine-config-daemon-r7cnz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 13:16:40 crc kubenswrapper[4677]: I1007 13:16:40.919558 4677 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-r7cnz" podUID="7879fa59-a7cb-4d29-ba3a-c91f43bfcba6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 13:17:09 crc kubenswrapper[4677]: I1007 13:17:09.991533 4677 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-29c8j"] Oct 07 13:17:09 crc kubenswrapper[4677]: I1007 13:17:09.992544 4677 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-29c8j" podUID="3458826a-000d-407d-92c8-236d1a05842e" containerName="nbdb" containerID="cri-o://99b5fbb5ad3249aa5264c37bd635ed5f6283ec72c7eb071002cd7bddc12052f7" gracePeriod=30 Oct 07 13:17:09 crc kubenswrapper[4677]: I1007 13:17:09.992617 4677 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-29c8j" podUID="3458826a-000d-407d-92c8-236d1a05842e" containerName="northd" containerID="cri-o://eee7c253a1a514447553be977a3e534608ef6a1178664bf139ee84ec41180db0" gracePeriod=30 Oct 07 13:17:09 crc kubenswrapper[4677]: I1007 13:17:09.992624 4677 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-29c8j" podUID="3458826a-000d-407d-92c8-236d1a05842e" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://b77b2aafb3baf1c5b72d62156bd1c1bec76385637d5795166fe3d4f22a169503" gracePeriod=30 Oct 07 13:17:09 crc kubenswrapper[4677]: I1007 13:17:09.992707 4677 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-29c8j" podUID="3458826a-000d-407d-92c8-236d1a05842e" containerName="kube-rbac-proxy-node" containerID="cri-o://1cf7d8cdd34bc883eae38c5e4690efd4e1c29cc633b5bbadc5de2b5b844a9da3" gracePeriod=30 Oct 07 13:17:09 crc kubenswrapper[4677]: I1007 13:17:09.992729 4677 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-29c8j" podUID="3458826a-000d-407d-92c8-236d1a05842e" containerName="sbdb" containerID="cri-o://b1f410624ff7e026c196d43d5ef830ce7b34981b703d5399a135dab0122640ca" gracePeriod=30 Oct 07 13:17:09 crc kubenswrapper[4677]: I1007 13:17:09.992789 4677 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-29c8j" podUID="3458826a-000d-407d-92c8-236d1a05842e" containerName="ovn-acl-logging" containerID="cri-o://f333db7aeb7d3cd308131992b4cd1284c1c56e27bbfd731404febc0efc953925" gracePeriod=30 Oct 07 13:17:09 crc kubenswrapper[4677]: I1007 13:17:09.992509 4677 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-29c8j" podUID="3458826a-000d-407d-92c8-236d1a05842e" containerName="ovn-controller" containerID="cri-o://3ddf4e352b778815786f6fb204486a53d958310e53569f89a2895fe388a727da" gracePeriod=30 Oct 07 13:17:10 crc kubenswrapper[4677]: I1007 13:17:10.024151 4677 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-29c8j" podUID="3458826a-000d-407d-92c8-236d1a05842e" containerName="ovnkube-controller" containerID="cri-o://4931c26a24a9442024978a83085456f080f6de6d5f334a435bcc6ced01d30f93" gracePeriod=30 Oct 07 13:17:10 crc kubenswrapper[4677]: I1007 13:17:10.776027 4677 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-29c8j_3458826a-000d-407d-92c8-236d1a05842e/ovnkube-controller/3.log" Oct 07 13:17:10 crc kubenswrapper[4677]: I1007 13:17:10.779651 4677 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-29c8j_3458826a-000d-407d-92c8-236d1a05842e/ovn-acl-logging/0.log" Oct 07 13:17:10 crc kubenswrapper[4677]: I1007 13:17:10.780352 4677 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-29c8j_3458826a-000d-407d-92c8-236d1a05842e/ovn-controller/0.log" Oct 07 13:17:10 crc kubenswrapper[4677]: I1007 13:17:10.781133 4677 generic.go:334] "Generic (PLEG): container finished" podID="3458826a-000d-407d-92c8-236d1a05842e" containerID="4931c26a24a9442024978a83085456f080f6de6d5f334a435bcc6ced01d30f93" exitCode=0 Oct 07 13:17:10 crc kubenswrapper[4677]: I1007 13:17:10.781171 4677 generic.go:334] "Generic (PLEG): container finished" podID="3458826a-000d-407d-92c8-236d1a05842e" containerID="b1f410624ff7e026c196d43d5ef830ce7b34981b703d5399a135dab0122640ca" exitCode=0 Oct 07 13:17:10 crc kubenswrapper[4677]: I1007 13:17:10.781189 4677 generic.go:334] "Generic (PLEG): container finished" podID="3458826a-000d-407d-92c8-236d1a05842e" 
containerID="99b5fbb5ad3249aa5264c37bd635ed5f6283ec72c7eb071002cd7bddc12052f7" exitCode=0 Oct 07 13:17:10 crc kubenswrapper[4677]: I1007 13:17:10.781209 4677 generic.go:334] "Generic (PLEG): container finished" podID="3458826a-000d-407d-92c8-236d1a05842e" containerID="eee7c253a1a514447553be977a3e534608ef6a1178664bf139ee84ec41180db0" exitCode=0 Oct 07 13:17:10 crc kubenswrapper[4677]: I1007 13:17:10.781224 4677 generic.go:334] "Generic (PLEG): container finished" podID="3458826a-000d-407d-92c8-236d1a05842e" containerID="b77b2aafb3baf1c5b72d62156bd1c1bec76385637d5795166fe3d4f22a169503" exitCode=0 Oct 07 13:17:10 crc kubenswrapper[4677]: I1007 13:17:10.781238 4677 generic.go:334] "Generic (PLEG): container finished" podID="3458826a-000d-407d-92c8-236d1a05842e" containerID="1cf7d8cdd34bc883eae38c5e4690efd4e1c29cc633b5bbadc5de2b5b844a9da3" exitCode=0 Oct 07 13:17:10 crc kubenswrapper[4677]: I1007 13:17:10.781233 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29c8j" event={"ID":"3458826a-000d-407d-92c8-236d1a05842e","Type":"ContainerDied","Data":"4931c26a24a9442024978a83085456f080f6de6d5f334a435bcc6ced01d30f93"} Oct 07 13:17:10 crc kubenswrapper[4677]: I1007 13:17:10.781252 4677 generic.go:334] "Generic (PLEG): container finished" podID="3458826a-000d-407d-92c8-236d1a05842e" containerID="f333db7aeb7d3cd308131992b4cd1284c1c56e27bbfd731404febc0efc953925" exitCode=143 Oct 07 13:17:10 crc kubenswrapper[4677]: I1007 13:17:10.781360 4677 generic.go:334] "Generic (PLEG): container finished" podID="3458826a-000d-407d-92c8-236d1a05842e" containerID="3ddf4e352b778815786f6fb204486a53d958310e53569f89a2895fe388a727da" exitCode=143 Oct 07 13:17:10 crc kubenswrapper[4677]: I1007 13:17:10.781340 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29c8j" event={"ID":"3458826a-000d-407d-92c8-236d1a05842e","Type":"ContainerDied","Data":"b1f410624ff7e026c196d43d5ef830ce7b34981b703d5399a135dab0122640ca"} Oct 07 13:17:10 crc kubenswrapper[4677]: I1007 13:17:10.781402 4677 scope.go:117] "RemoveContainer" containerID="b0636ac68feadc31552df6dee8669b4b1d477332b3405bff9bd63eaaa3362a6f" Oct 07 13:17:10 crc kubenswrapper[4677]: I1007 13:17:10.781462 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29c8j" event={"ID":"3458826a-000d-407d-92c8-236d1a05842e","Type":"ContainerDied","Data":"99b5fbb5ad3249aa5264c37bd635ed5f6283ec72c7eb071002cd7bddc12052f7"} Oct 07 13:17:10 crc kubenswrapper[4677]: I1007 13:17:10.781506 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29c8j" event={"ID":"3458826a-000d-407d-92c8-236d1a05842e","Type":"ContainerDied","Data":"eee7c253a1a514447553be977a3e534608ef6a1178664bf139ee84ec41180db0"} Oct 07 13:17:10 crc kubenswrapper[4677]: I1007 13:17:10.781533 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29c8j" event={"ID":"3458826a-000d-407d-92c8-236d1a05842e","Type":"ContainerDied","Data":"b77b2aafb3baf1c5b72d62156bd1c1bec76385637d5795166fe3d4f22a169503"} Oct 07 13:17:10 crc kubenswrapper[4677]: I1007 13:17:10.781560 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29c8j" event={"ID":"3458826a-000d-407d-92c8-236d1a05842e","Type":"ContainerDied","Data":"1cf7d8cdd34bc883eae38c5e4690efd4e1c29cc633b5bbadc5de2b5b844a9da3"} Oct 07 13:17:10 crc kubenswrapper[4677]: I1007 13:17:10.781583 4677 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29c8j" event={"ID":"3458826a-000d-407d-92c8-236d1a05842e","Type":"ContainerDied","Data":"f333db7aeb7d3cd308131992b4cd1284c1c56e27bbfd731404febc0efc953925"} Oct 07 13:17:10 crc kubenswrapper[4677]: I1007 13:17:10.781603 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29c8j" event={"ID":"3458826a-000d-407d-92c8-236d1a05842e","Type":"ContainerDied","Data":"3ddf4e352b778815786f6fb204486a53d958310e53569f89a2895fe388a727da"} Oct 07 13:17:10 crc kubenswrapper[4677]: I1007 13:17:10.781621 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29c8j" event={"ID":"3458826a-000d-407d-92c8-236d1a05842e","Type":"ContainerDied","Data":"3277b1e8f93bc16de57864a4db5b92424ff2e8ca9b2fee80d78fd814453a172f"} Oct 07 13:17:10 crc kubenswrapper[4677]: I1007 13:17:10.781638 4677 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3277b1e8f93bc16de57864a4db5b92424ff2e8ca9b2fee80d78fd814453a172f" Oct 07 13:17:10 crc kubenswrapper[4677]: I1007 13:17:10.784579 4677 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-pjgpx_73bebfb3-50b5-48b6-b348-1d1feb6202d2/kube-multus/2.log" Oct 07 13:17:10 crc kubenswrapper[4677]: I1007 13:17:10.785510 4677 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-pjgpx_73bebfb3-50b5-48b6-b348-1d1feb6202d2/kube-multus/1.log" Oct 07 13:17:10 crc kubenswrapper[4677]: I1007 13:17:10.785573 4677 generic.go:334] "Generic (PLEG): container finished" podID="73bebfb3-50b5-48b6-b348-1d1feb6202d2" containerID="cab6ba341a7d3ec923ec6a10fba00b684271e2e0c030e0ed8b119f472414895a" exitCode=2 Oct 07 13:17:10 crc kubenswrapper[4677]: I1007 13:17:10.785608 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-pjgpx" event={"ID":"73bebfb3-50b5-48b6-b348-1d1feb6202d2","Type":"ContainerDied","Data":"cab6ba341a7d3ec923ec6a10fba00b684271e2e0c030e0ed8b119f472414895a"} Oct 07 13:17:10 crc kubenswrapper[4677]: I1007 13:17:10.786154 4677 scope.go:117] "RemoveContainer" containerID="cab6ba341a7d3ec923ec6a10fba00b684271e2e0c030e0ed8b119f472414895a" Oct 07 13:17:10 crc kubenswrapper[4677]: E1007 13:17:10.786362 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-pjgpx_openshift-multus(73bebfb3-50b5-48b6-b348-1d1feb6202d2)\"" pod="openshift-multus/multus-pjgpx" podUID="73bebfb3-50b5-48b6-b348-1d1feb6202d2" Oct 07 13:17:10 crc kubenswrapper[4677]: I1007 13:17:10.820720 4677 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-29c8j_3458826a-000d-407d-92c8-236d1a05842e/ovn-acl-logging/0.log" Oct 07 13:17:10 crc kubenswrapper[4677]: I1007 13:17:10.821677 4677 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-29c8j_3458826a-000d-407d-92c8-236d1a05842e/ovn-controller/0.log" Oct 07 13:17:10 crc kubenswrapper[4677]: I1007 13:17:10.822289 4677 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-29c8j" Oct 07 13:17:10 crc kubenswrapper[4677]: I1007 13:17:10.825531 4677 scope.go:117] "RemoveContainer" containerID="fcea2caf828321399fab99a6225cb39dd0c4aba8481cc040a10d86e90b6e4029" Oct 07 13:17:10 crc kubenswrapper[4677]: I1007 13:17:10.892486 4677 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-s8rzt"] Oct 07 13:17:10 crc kubenswrapper[4677]: E1007 13:17:10.892850 4677 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3458826a-000d-407d-92c8-236d1a05842e" containerName="ovnkube-controller" Oct 07 13:17:10 crc kubenswrapper[4677]: I1007 13:17:10.892883 4677 state_mem.go:107] "Deleted CPUSet assignment" podUID="3458826a-000d-407d-92c8-236d1a05842e" containerName="ovnkube-controller" Oct 07 13:17:10 crc kubenswrapper[4677]: E1007 13:17:10.892902 4677 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3458826a-000d-407d-92c8-236d1a05842e" containerName="ovnkube-controller" Oct 07 13:17:10 crc kubenswrapper[4677]: I1007 13:17:10.892916 4677 state_mem.go:107] "Deleted CPUSet assignment" podUID="3458826a-000d-407d-92c8-236d1a05842e" containerName="ovnkube-controller" Oct 07 13:17:10 crc kubenswrapper[4677]: E1007 13:17:10.892933 4677 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3458826a-000d-407d-92c8-236d1a05842e" containerName="sbdb" Oct 07 13:17:10 crc kubenswrapper[4677]: I1007 13:17:10.892948 4677 state_mem.go:107] "Deleted CPUSet assignment" podUID="3458826a-000d-407d-92c8-236d1a05842e" containerName="sbdb" Oct 07 13:17:10 crc kubenswrapper[4677]: E1007 13:17:10.892965 4677 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3458826a-000d-407d-92c8-236d1a05842e" containerName="northd" Oct 07 13:17:10 crc kubenswrapper[4677]: I1007 13:17:10.892978 4677 state_mem.go:107] "Deleted CPUSet assignment" podUID="3458826a-000d-407d-92c8-236d1a05842e" containerName="northd" Oct 07 13:17:10 crc kubenswrapper[4677]: E1007 13:17:10.892995 4677 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3458826a-000d-407d-92c8-236d1a05842e" containerName="ovnkube-controller" Oct 07 13:17:10 crc kubenswrapper[4677]: I1007 13:17:10.893008 4677 state_mem.go:107] "Deleted CPUSet assignment" podUID="3458826a-000d-407d-92c8-236d1a05842e" containerName="ovnkube-controller" Oct 07 13:17:10 crc kubenswrapper[4677]: E1007 13:17:10.893028 4677 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3458826a-000d-407d-92c8-236d1a05842e" containerName="kube-rbac-proxy-node" Oct 07 13:17:10 crc kubenswrapper[4677]: I1007 13:17:10.893040 4677 state_mem.go:107] "Deleted CPUSet assignment" podUID="3458826a-000d-407d-92c8-236d1a05842e" containerName="kube-rbac-proxy-node" Oct 07 13:17:10 crc kubenswrapper[4677]: E1007 13:17:10.893060 4677 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb3cbf3e-6351-4579-8180-9880390f5246" containerName="collect-profiles" Oct 07 13:17:10 crc kubenswrapper[4677]: I1007 13:17:10.893075 4677 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb3cbf3e-6351-4579-8180-9880390f5246" containerName="collect-profiles" Oct 07 13:17:10 crc kubenswrapper[4677]: E1007 13:17:10.893095 4677 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3458826a-000d-407d-92c8-236d1a05842e" containerName="kube-rbac-proxy-ovn-metrics" Oct 07 13:17:10 crc kubenswrapper[4677]: I1007 13:17:10.893109 4677 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="3458826a-000d-407d-92c8-236d1a05842e" containerName="kube-rbac-proxy-ovn-metrics" Oct 07 13:17:10 crc kubenswrapper[4677]: E1007 13:17:10.893131 4677 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3458826a-000d-407d-92c8-236d1a05842e" containerName="kubecfg-setup" Oct 07 13:17:10 crc kubenswrapper[4677]: I1007 13:17:10.893148 4677 state_mem.go:107] "Deleted CPUSet assignment" podUID="3458826a-000d-407d-92c8-236d1a05842e" containerName="kubecfg-setup" Oct 07 13:17:10 crc kubenswrapper[4677]: E1007 13:17:10.893176 4677 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3458826a-000d-407d-92c8-236d1a05842e" containerName="nbdb" Oct 07 13:17:10 crc kubenswrapper[4677]: I1007 13:17:10.893193 4677 state_mem.go:107] "Deleted CPUSet assignment" podUID="3458826a-000d-407d-92c8-236d1a05842e" containerName="nbdb" Oct 07 13:17:10 crc kubenswrapper[4677]: E1007 13:17:10.893218 4677 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3458826a-000d-407d-92c8-236d1a05842e" containerName="ovn-acl-logging" Oct 07 13:17:10 crc kubenswrapper[4677]: I1007 13:17:10.893238 4677 state_mem.go:107] "Deleted CPUSet assignment" podUID="3458826a-000d-407d-92c8-236d1a05842e" containerName="ovn-acl-logging" Oct 07 13:17:10 crc kubenswrapper[4677]: E1007 13:17:10.893265 4677 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3458826a-000d-407d-92c8-236d1a05842e" containerName="ovn-controller" Oct 07 13:17:10 crc kubenswrapper[4677]: I1007 13:17:10.893278 4677 state_mem.go:107] "Deleted CPUSet assignment" podUID="3458826a-000d-407d-92c8-236d1a05842e" containerName="ovn-controller" Oct 07 13:17:10 crc kubenswrapper[4677]: I1007 13:17:10.893482 4677 memory_manager.go:354] "RemoveStaleState removing state" podUID="3458826a-000d-407d-92c8-236d1a05842e" containerName="kube-rbac-proxy-ovn-metrics" Oct 07 13:17:10 crc kubenswrapper[4677]: I1007 13:17:10.893501 4677 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb3cbf3e-6351-4579-8180-9880390f5246" containerName="collect-profiles" Oct 07 13:17:10 crc kubenswrapper[4677]: I1007 13:17:10.893518 4677 memory_manager.go:354] "RemoveStaleState removing state" podUID="3458826a-000d-407d-92c8-236d1a05842e" containerName="ovnkube-controller" Oct 07 13:17:10 crc kubenswrapper[4677]: I1007 13:17:10.893539 4677 memory_manager.go:354] "RemoveStaleState removing state" podUID="3458826a-000d-407d-92c8-236d1a05842e" containerName="ovnkube-controller" Oct 07 13:17:10 crc kubenswrapper[4677]: I1007 13:17:10.893557 4677 memory_manager.go:354] "RemoveStaleState removing state" podUID="3458826a-000d-407d-92c8-236d1a05842e" containerName="ovn-controller" Oct 07 13:17:10 crc kubenswrapper[4677]: I1007 13:17:10.893574 4677 memory_manager.go:354] "RemoveStaleState removing state" podUID="3458826a-000d-407d-92c8-236d1a05842e" containerName="northd" Oct 07 13:17:10 crc kubenswrapper[4677]: I1007 13:17:10.893596 4677 memory_manager.go:354] "RemoveStaleState removing state" podUID="3458826a-000d-407d-92c8-236d1a05842e" containerName="ovnkube-controller" Oct 07 13:17:10 crc kubenswrapper[4677]: I1007 13:17:10.893614 4677 memory_manager.go:354] "RemoveStaleState removing state" podUID="3458826a-000d-407d-92c8-236d1a05842e" containerName="ovnkube-controller" Oct 07 13:17:10 crc kubenswrapper[4677]: I1007 13:17:10.893630 4677 memory_manager.go:354] "RemoveStaleState removing state" podUID="3458826a-000d-407d-92c8-236d1a05842e" containerName="kube-rbac-proxy-node" Oct 07 13:17:10 crc kubenswrapper[4677]: I1007 
13:17:10.893649 4677 memory_manager.go:354] "RemoveStaleState removing state" podUID="3458826a-000d-407d-92c8-236d1a05842e" containerName="ovnkube-controller" Oct 07 13:17:10 crc kubenswrapper[4677]: I1007 13:17:10.893666 4677 memory_manager.go:354] "RemoveStaleState removing state" podUID="3458826a-000d-407d-92c8-236d1a05842e" containerName="nbdb" Oct 07 13:17:10 crc kubenswrapper[4677]: I1007 13:17:10.893681 4677 memory_manager.go:354] "RemoveStaleState removing state" podUID="3458826a-000d-407d-92c8-236d1a05842e" containerName="sbdb" Oct 07 13:17:10 crc kubenswrapper[4677]: I1007 13:17:10.893696 4677 memory_manager.go:354] "RemoveStaleState removing state" podUID="3458826a-000d-407d-92c8-236d1a05842e" containerName="ovn-acl-logging" Oct 07 13:17:10 crc kubenswrapper[4677]: E1007 13:17:10.893902 4677 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3458826a-000d-407d-92c8-236d1a05842e" containerName="ovnkube-controller" Oct 07 13:17:10 crc kubenswrapper[4677]: I1007 13:17:10.893919 4677 state_mem.go:107] "Deleted CPUSet assignment" podUID="3458826a-000d-407d-92c8-236d1a05842e" containerName="ovnkube-controller" Oct 07 13:17:10 crc kubenswrapper[4677]: E1007 13:17:10.893938 4677 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3458826a-000d-407d-92c8-236d1a05842e" containerName="ovnkube-controller" Oct 07 13:17:10 crc kubenswrapper[4677]: I1007 13:17:10.893950 4677 state_mem.go:107] "Deleted CPUSet assignment" podUID="3458826a-000d-407d-92c8-236d1a05842e" containerName="ovnkube-controller" Oct 07 13:17:10 crc kubenswrapper[4677]: I1007 13:17:10.896182 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-s8rzt" Oct 07 13:17:10 crc kubenswrapper[4677]: I1007 13:17:10.918146 4677 patch_prober.go:28] interesting pod/machine-config-daemon-r7cnz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 13:17:10 crc kubenswrapper[4677]: I1007 13:17:10.918194 4677 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-r7cnz" podUID="7879fa59-a7cb-4d29-ba3a-c91f43bfcba6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 13:17:10 crc kubenswrapper[4677]: I1007 13:17:10.977956 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3458826a-000d-407d-92c8-236d1a05842e-run-openvswitch\") pod \"3458826a-000d-407d-92c8-236d1a05842e\" (UID: \"3458826a-000d-407d-92c8-236d1a05842e\") " Oct 07 13:17:10 crc kubenswrapper[4677]: I1007 13:17:10.978299 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/3458826a-000d-407d-92c8-236d1a05842e-host-slash\") pod \"3458826a-000d-407d-92c8-236d1a05842e\" (UID: \"3458826a-000d-407d-92c8-236d1a05842e\") " Oct 07 13:17:10 crc kubenswrapper[4677]: I1007 13:17:10.978408 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/3458826a-000d-407d-92c8-236d1a05842e-env-overrides\") pod \"3458826a-000d-407d-92c8-236d1a05842e\" (UID: \"3458826a-000d-407d-92c8-236d1a05842e\") " Oct 07 
13:17:10 crc kubenswrapper[4677]: I1007 13:17:10.978531 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3458826a-000d-407d-92c8-236d1a05842e-host-run-ovn-kubernetes\") pod \"3458826a-000d-407d-92c8-236d1a05842e\" (UID: \"3458826a-000d-407d-92c8-236d1a05842e\") " Oct 07 13:17:10 crc kubenswrapper[4677]: I1007 13:17:10.978563 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3458826a-000d-407d-92c8-236d1a05842e-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "3458826a-000d-407d-92c8-236d1a05842e" (UID: "3458826a-000d-407d-92c8-236d1a05842e"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 13:17:10 crc kubenswrapper[4677]: I1007 13:17:10.978574 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3458826a-000d-407d-92c8-236d1a05842e-host-slash" (OuterVolumeSpecName: "host-slash") pod "3458826a-000d-407d-92c8-236d1a05842e" (UID: "3458826a-000d-407d-92c8-236d1a05842e"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 13:17:10 crc kubenswrapper[4677]: I1007 13:17:10.978599 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3458826a-000d-407d-92c8-236d1a05842e-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "3458826a-000d-407d-92c8-236d1a05842e" (UID: "3458826a-000d-407d-92c8-236d1a05842e"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 13:17:10 crc kubenswrapper[4677]: I1007 13:17:10.978618 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/3458826a-000d-407d-92c8-236d1a05842e-systemd-units\") pod \"3458826a-000d-407d-92c8-236d1a05842e\" (UID: \"3458826a-000d-407d-92c8-236d1a05842e\") " Oct 07 13:17:10 crc kubenswrapper[4677]: I1007 13:17:10.978734 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/3458826a-000d-407d-92c8-236d1a05842e-ovnkube-config\") pod \"3458826a-000d-407d-92c8-236d1a05842e\" (UID: \"3458826a-000d-407d-92c8-236d1a05842e\") " Oct 07 13:17:10 crc kubenswrapper[4677]: I1007 13:17:10.978764 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3458826a-000d-407d-92c8-236d1a05842e-host-run-netns\") pod \"3458826a-000d-407d-92c8-236d1a05842e\" (UID: \"3458826a-000d-407d-92c8-236d1a05842e\") " Oct 07 13:17:10 crc kubenswrapper[4677]: I1007 13:17:10.978792 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/3458826a-000d-407d-92c8-236d1a05842e-ovn-node-metrics-cert\") pod \"3458826a-000d-407d-92c8-236d1a05842e\" (UID: \"3458826a-000d-407d-92c8-236d1a05842e\") " Oct 07 13:17:10 crc kubenswrapper[4677]: I1007 13:17:10.978829 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/3458826a-000d-407d-92c8-236d1a05842e-node-log\") pod \"3458826a-000d-407d-92c8-236d1a05842e\" (UID: \"3458826a-000d-407d-92c8-236d1a05842e\") " Oct 07 13:17:10 crc kubenswrapper[4677]: I1007 13:17:10.978848 4677 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/3458826a-000d-407d-92c8-236d1a05842e-run-systemd\") pod \"3458826a-000d-407d-92c8-236d1a05842e\" (UID: \"3458826a-000d-407d-92c8-236d1a05842e\") " Oct 07 13:17:10 crc kubenswrapper[4677]: I1007 13:17:10.978894 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3458826a-000d-407d-92c8-236d1a05842e-var-lib-openvswitch\") pod \"3458826a-000d-407d-92c8-236d1a05842e\" (UID: \"3458826a-000d-407d-92c8-236d1a05842e\") " Oct 07 13:17:10 crc kubenswrapper[4677]: I1007 13:17:10.978930 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3458826a-000d-407d-92c8-236d1a05842e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"3458826a-000d-407d-92c8-236d1a05842e\" (UID: \"3458826a-000d-407d-92c8-236d1a05842e\") " Oct 07 13:17:10 crc kubenswrapper[4677]: I1007 13:17:10.978952 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vm7l2\" (UniqueName: \"kubernetes.io/projected/3458826a-000d-407d-92c8-236d1a05842e-kube-api-access-vm7l2\") pod \"3458826a-000d-407d-92c8-236d1a05842e\" (UID: \"3458826a-000d-407d-92c8-236d1a05842e\") " Oct 07 13:17:10 crc kubenswrapper[4677]: I1007 13:17:10.978974 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/3458826a-000d-407d-92c8-236d1a05842e-run-ovn\") pod \"3458826a-000d-407d-92c8-236d1a05842e\" (UID: \"3458826a-000d-407d-92c8-236d1a05842e\") " Oct 07 13:17:10 crc kubenswrapper[4677]: I1007 13:17:10.978999 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3458826a-000d-407d-92c8-236d1a05842e-host-cni-bin\") pod \"3458826a-000d-407d-92c8-236d1a05842e\" (UID: \"3458826a-000d-407d-92c8-236d1a05842e\") " Oct 07 13:17:10 crc kubenswrapper[4677]: I1007 13:17:10.979016 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/3458826a-000d-407d-92c8-236d1a05842e-log-socket\") pod \"3458826a-000d-407d-92c8-236d1a05842e\" (UID: \"3458826a-000d-407d-92c8-236d1a05842e\") " Oct 07 13:17:10 crc kubenswrapper[4677]: I1007 13:17:10.979812 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3458826a-000d-407d-92c8-236d1a05842e-etc-openvswitch\") pod \"3458826a-000d-407d-92c8-236d1a05842e\" (UID: \"3458826a-000d-407d-92c8-236d1a05842e\") " Oct 07 13:17:10 crc kubenswrapper[4677]: I1007 13:17:10.979041 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3458826a-000d-407d-92c8-236d1a05842e-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "3458826a-000d-407d-92c8-236d1a05842e" (UID: "3458826a-000d-407d-92c8-236d1a05842e"). InnerVolumeSpecName "var-lib-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 13:17:10 crc kubenswrapper[4677]: I1007 13:17:10.979853 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/3458826a-000d-407d-92c8-236d1a05842e-host-cni-netd\") pod \"3458826a-000d-407d-92c8-236d1a05842e\" (UID: \"3458826a-000d-407d-92c8-236d1a05842e\") " Oct 07 13:17:10 crc kubenswrapper[4677]: I1007 13:17:10.979101 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3458826a-000d-407d-92c8-236d1a05842e-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "3458826a-000d-407d-92c8-236d1a05842e" (UID: "3458826a-000d-407d-92c8-236d1a05842e"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 13:17:10 crc kubenswrapper[4677]: I1007 13:17:10.979879 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/3458826a-000d-407d-92c8-236d1a05842e-ovnkube-script-lib\") pod \"3458826a-000d-407d-92c8-236d1a05842e\" (UID: \"3458826a-000d-407d-92c8-236d1a05842e\") " Oct 07 13:17:10 crc kubenswrapper[4677]: I1007 13:17:10.979921 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/3458826a-000d-407d-92c8-236d1a05842e-host-kubelet\") pod \"3458826a-000d-407d-92c8-236d1a05842e\" (UID: \"3458826a-000d-407d-92c8-236d1a05842e\") " Oct 07 13:17:10 crc kubenswrapper[4677]: I1007 13:17:10.979158 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3458826a-000d-407d-92c8-236d1a05842e-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "3458826a-000d-407d-92c8-236d1a05842e" (UID: "3458826a-000d-407d-92c8-236d1a05842e"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 13:17:10 crc kubenswrapper[4677]: I1007 13:17:10.979955 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3458826a-000d-407d-92c8-236d1a05842e-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "3458826a-000d-407d-92c8-236d1a05842e" (UID: "3458826a-000d-407d-92c8-236d1a05842e"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 13:17:10 crc kubenswrapper[4677]: I1007 13:17:10.979190 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3458826a-000d-407d-92c8-236d1a05842e-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "3458826a-000d-407d-92c8-236d1a05842e" (UID: "3458826a-000d-407d-92c8-236d1a05842e"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 13:17:10 crc kubenswrapper[4677]: I1007 13:17:10.979215 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3458826a-000d-407d-92c8-236d1a05842e-log-socket" (OuterVolumeSpecName: "log-socket") pod "3458826a-000d-407d-92c8-236d1a05842e" (UID: "3458826a-000d-407d-92c8-236d1a05842e"). InnerVolumeSpecName "log-socket". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 13:17:10 crc kubenswrapper[4677]: I1007 13:17:10.979279 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3458826a-000d-407d-92c8-236d1a05842e-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "3458826a-000d-407d-92c8-236d1a05842e" (UID: "3458826a-000d-407d-92c8-236d1a05842e"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 13:17:10 crc kubenswrapper[4677]: I1007 13:17:10.980065 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e322094c-541b-4dd8-9834-015cc0fd21f7-run-ovn\") pod \"ovnkube-node-s8rzt\" (UID: \"e322094c-541b-4dd8-9834-015cc0fd21f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-s8rzt" Oct 07 13:17:10 crc kubenswrapper[4677]: I1007 13:17:10.980115 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e322094c-541b-4dd8-9834-015cc0fd21f7-log-socket\") pod \"ovnkube-node-s8rzt\" (UID: \"e322094c-541b-4dd8-9834-015cc0fd21f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-s8rzt" Oct 07 13:17:10 crc kubenswrapper[4677]: I1007 13:17:10.980147 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e322094c-541b-4dd8-9834-015cc0fd21f7-run-openvswitch\") pod \"ovnkube-node-s8rzt\" (UID: \"e322094c-541b-4dd8-9834-015cc0fd21f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-s8rzt" Oct 07 13:17:10 crc kubenswrapper[4677]: I1007 13:17:10.980185 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e322094c-541b-4dd8-9834-015cc0fd21f7-systemd-units\") pod \"ovnkube-node-s8rzt\" (UID: \"e322094c-541b-4dd8-9834-015cc0fd21f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-s8rzt" Oct 07 13:17:10 crc kubenswrapper[4677]: I1007 13:17:10.980204 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e322094c-541b-4dd8-9834-015cc0fd21f7-host-run-netns\") pod \"ovnkube-node-s8rzt\" (UID: \"e322094c-541b-4dd8-9834-015cc0fd21f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-s8rzt" Oct 07 13:17:10 crc kubenswrapper[4677]: I1007 13:17:10.980233 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e322094c-541b-4dd8-9834-015cc0fd21f7-run-systemd\") pod \"ovnkube-node-s8rzt\" (UID: \"e322094c-541b-4dd8-9834-015cc0fd21f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-s8rzt" Oct 07 13:17:10 crc kubenswrapper[4677]: I1007 13:17:10.979259 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3458826a-000d-407d-92c8-236d1a05842e-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "3458826a-000d-407d-92c8-236d1a05842e" (UID: "3458826a-000d-407d-92c8-236d1a05842e"). InnerVolumeSpecName "host-cni-bin". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 13:17:10 crc kubenswrapper[4677]: I1007 13:17:10.979243 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3458826a-000d-407d-92c8-236d1a05842e-node-log" (OuterVolumeSpecName: "node-log") pod "3458826a-000d-407d-92c8-236d1a05842e" (UID: "3458826a-000d-407d-92c8-236d1a05842e"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 13:17:10 crc kubenswrapper[4677]: I1007 13:17:10.979669 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3458826a-000d-407d-92c8-236d1a05842e-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "3458826a-000d-407d-92c8-236d1a05842e" (UID: "3458826a-000d-407d-92c8-236d1a05842e"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:17:10 crc kubenswrapper[4677]: I1007 13:17:10.979891 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3458826a-000d-407d-92c8-236d1a05842e-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "3458826a-000d-407d-92c8-236d1a05842e" (UID: "3458826a-000d-407d-92c8-236d1a05842e"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 13:17:10 crc kubenswrapper[4677]: I1007 13:17:10.980108 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3458826a-000d-407d-92c8-236d1a05842e-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "3458826a-000d-407d-92c8-236d1a05842e" (UID: "3458826a-000d-407d-92c8-236d1a05842e"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 13:17:10 crc kubenswrapper[4677]: I1007 13:17:10.980415 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3458826a-000d-407d-92c8-236d1a05842e-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "3458826a-000d-407d-92c8-236d1a05842e" (UID: "3458826a-000d-407d-92c8-236d1a05842e"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:17:10 crc kubenswrapper[4677]: I1007 13:17:10.980852 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e322094c-541b-4dd8-9834-015cc0fd21f7-host-slash\") pod \"ovnkube-node-s8rzt\" (UID: \"e322094c-541b-4dd8-9834-015cc0fd21f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-s8rzt" Oct 07 13:17:10 crc kubenswrapper[4677]: I1007 13:17:10.981224 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l2xdt\" (UniqueName: \"kubernetes.io/projected/e322094c-541b-4dd8-9834-015cc0fd21f7-kube-api-access-l2xdt\") pod \"ovnkube-node-s8rzt\" (UID: \"e322094c-541b-4dd8-9834-015cc0fd21f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-s8rzt" Oct 07 13:17:10 crc kubenswrapper[4677]: I1007 13:17:10.981333 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e322094c-541b-4dd8-9834-015cc0fd21f7-ovnkube-config\") pod \"ovnkube-node-s8rzt\" (UID: \"e322094c-541b-4dd8-9834-015cc0fd21f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-s8rzt" Oct 07 13:17:10 crc kubenswrapper[4677]: I1007 13:17:10.981712 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e322094c-541b-4dd8-9834-015cc0fd21f7-host-cni-bin\") pod \"ovnkube-node-s8rzt\" (UID: \"e322094c-541b-4dd8-9834-015cc0fd21f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-s8rzt" Oct 07 13:17:10 crc kubenswrapper[4677]: I1007 13:17:10.981892 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e322094c-541b-4dd8-9834-015cc0fd21f7-node-log\") pod \"ovnkube-node-s8rzt\" (UID: \"e322094c-541b-4dd8-9834-015cc0fd21f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-s8rzt" Oct 07 13:17:10 crc kubenswrapper[4677]: I1007 13:17:10.982020 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e322094c-541b-4dd8-9834-015cc0fd21f7-env-overrides\") pod \"ovnkube-node-s8rzt\" (UID: \"e322094c-541b-4dd8-9834-015cc0fd21f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-s8rzt" Oct 07 13:17:10 crc kubenswrapper[4677]: I1007 13:17:10.982191 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3458826a-000d-407d-92c8-236d1a05842e-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "3458826a-000d-407d-92c8-236d1a05842e" (UID: "3458826a-000d-407d-92c8-236d1a05842e"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:17:10 crc kubenswrapper[4677]: I1007 13:17:10.982201 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e322094c-541b-4dd8-9834-015cc0fd21f7-host-cni-netd\") pod \"ovnkube-node-s8rzt\" (UID: \"e322094c-541b-4dd8-9834-015cc0fd21f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-s8rzt" Oct 07 13:17:10 crc kubenswrapper[4677]: I1007 13:17:10.982349 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e322094c-541b-4dd8-9834-015cc0fd21f7-etc-openvswitch\") pod \"ovnkube-node-s8rzt\" (UID: \"e322094c-541b-4dd8-9834-015cc0fd21f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-s8rzt" Oct 07 13:17:10 crc kubenswrapper[4677]: I1007 13:17:10.982513 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e322094c-541b-4dd8-9834-015cc0fd21f7-ovnkube-script-lib\") pod \"ovnkube-node-s8rzt\" (UID: \"e322094c-541b-4dd8-9834-015cc0fd21f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-s8rzt" Oct 07 13:17:10 crc kubenswrapper[4677]: I1007 13:17:10.982702 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e322094c-541b-4dd8-9834-015cc0fd21f7-ovn-node-metrics-cert\") pod \"ovnkube-node-s8rzt\" (UID: \"e322094c-541b-4dd8-9834-015cc0fd21f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-s8rzt" Oct 07 13:17:10 crc kubenswrapper[4677]: I1007 13:17:10.982816 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e322094c-541b-4dd8-9834-015cc0fd21f7-host-run-ovn-kubernetes\") pod \"ovnkube-node-s8rzt\" (UID: \"e322094c-541b-4dd8-9834-015cc0fd21f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-s8rzt" Oct 07 13:17:10 crc kubenswrapper[4677]: I1007 13:17:10.982937 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e322094c-541b-4dd8-9834-015cc0fd21f7-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-s8rzt\" (UID: \"e322094c-541b-4dd8-9834-015cc0fd21f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-s8rzt" Oct 07 13:17:10 crc kubenswrapper[4677]: I1007 13:17:10.983046 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e322094c-541b-4dd8-9834-015cc0fd21f7-var-lib-openvswitch\") pod \"ovnkube-node-s8rzt\" (UID: \"e322094c-541b-4dd8-9834-015cc0fd21f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-s8rzt" Oct 07 13:17:10 crc kubenswrapper[4677]: I1007 13:17:10.983086 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e322094c-541b-4dd8-9834-015cc0fd21f7-host-kubelet\") pod \"ovnkube-node-s8rzt\" (UID: \"e322094c-541b-4dd8-9834-015cc0fd21f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-s8rzt" Oct 07 13:17:10 crc kubenswrapper[4677]: I1007 13:17:10.983164 4677 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/3458826a-000d-407d-92c8-236d1a05842e-host-run-netns\") on node \"crc\" DevicePath \"\"" Oct 07 13:17:10 crc kubenswrapper[4677]: I1007 13:17:10.983187 4677 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/3458826a-000d-407d-92c8-236d1a05842e-node-log\") on node \"crc\" DevicePath \"\"" Oct 07 13:17:10 crc kubenswrapper[4677]: I1007 13:17:10.983196 4677 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3458826a-000d-407d-92c8-236d1a05842e-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Oct 07 13:17:10 crc kubenswrapper[4677]: I1007 13:17:10.983209 4677 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3458826a-000d-407d-92c8-236d1a05842e-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Oct 07 13:17:10 crc kubenswrapper[4677]: I1007 13:17:10.983219 4677 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/3458826a-000d-407d-92c8-236d1a05842e-run-ovn\") on node \"crc\" DevicePath \"\"" Oct 07 13:17:10 crc kubenswrapper[4677]: I1007 13:17:10.983227 4677 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/3458826a-000d-407d-92c8-236d1a05842e-log-socket\") on node \"crc\" DevicePath \"\"" Oct 07 13:17:10 crc kubenswrapper[4677]: I1007 13:17:10.983236 4677 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3458826a-000d-407d-92c8-236d1a05842e-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Oct 07 13:17:10 crc kubenswrapper[4677]: I1007 13:17:10.983244 4677 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3458826a-000d-407d-92c8-236d1a05842e-host-cni-bin\") on node \"crc\" DevicePath \"\"" Oct 07 13:17:10 crc kubenswrapper[4677]: I1007 13:17:10.983252 4677 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/3458826a-000d-407d-92c8-236d1a05842e-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Oct 07 13:17:10 crc kubenswrapper[4677]: I1007 13:17:10.983260 4677 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/3458826a-000d-407d-92c8-236d1a05842e-host-kubelet\") on node \"crc\" DevicePath \"\"" Oct 07 13:17:10 crc kubenswrapper[4677]: I1007 13:17:10.983268 4677 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/3458826a-000d-407d-92c8-236d1a05842e-host-cni-netd\") on node \"crc\" DevicePath \"\"" Oct 07 13:17:10 crc kubenswrapper[4677]: I1007 13:17:10.983276 4677 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3458826a-000d-407d-92c8-236d1a05842e-run-openvswitch\") on node \"crc\" DevicePath \"\"" Oct 07 13:17:10 crc kubenswrapper[4677]: I1007 13:17:10.983285 4677 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/3458826a-000d-407d-92c8-236d1a05842e-host-slash\") on node \"crc\" DevicePath \"\"" Oct 07 13:17:10 crc kubenswrapper[4677]: I1007 13:17:10.983293 4677 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/3458826a-000d-407d-92c8-236d1a05842e-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Oct 07 13:17:10 crc kubenswrapper[4677]: I1007 13:17:10.983301 4677 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/3458826a-000d-407d-92c8-236d1a05842e-env-overrides\") on node \"crc\" DevicePath \"\"" Oct 07 13:17:10 crc kubenswrapper[4677]: I1007 13:17:10.983310 4677 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/3458826a-000d-407d-92c8-236d1a05842e-systemd-units\") on node \"crc\" DevicePath \"\"" Oct 07 13:17:10 crc kubenswrapper[4677]: I1007 13:17:10.983319 4677 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/3458826a-000d-407d-92c8-236d1a05842e-ovnkube-config\") on node \"crc\" DevicePath \"\"" Oct 07 13:17:11 crc kubenswrapper[4677]: I1007 13:17:10.998060 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3458826a-000d-407d-92c8-236d1a05842e-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "3458826a-000d-407d-92c8-236d1a05842e" (UID: "3458826a-000d-407d-92c8-236d1a05842e"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:17:11 crc kubenswrapper[4677]: I1007 13:17:10.998282 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3458826a-000d-407d-92c8-236d1a05842e-kube-api-access-vm7l2" (OuterVolumeSpecName: "kube-api-access-vm7l2") pod "3458826a-000d-407d-92c8-236d1a05842e" (UID: "3458826a-000d-407d-92c8-236d1a05842e"). InnerVolumeSpecName "kube-api-access-vm7l2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:17:11 crc kubenswrapper[4677]: I1007 13:17:11.007190 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3458826a-000d-407d-92c8-236d1a05842e-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "3458826a-000d-407d-92c8-236d1a05842e" (UID: "3458826a-000d-407d-92c8-236d1a05842e"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 13:17:11 crc kubenswrapper[4677]: I1007 13:17:11.084964 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e322094c-541b-4dd8-9834-015cc0fd21f7-etc-openvswitch\") pod \"ovnkube-node-s8rzt\" (UID: \"e322094c-541b-4dd8-9834-015cc0fd21f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-s8rzt" Oct 07 13:17:11 crc kubenswrapper[4677]: I1007 13:17:11.085068 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e322094c-541b-4dd8-9834-015cc0fd21f7-ovnkube-script-lib\") pod \"ovnkube-node-s8rzt\" (UID: \"e322094c-541b-4dd8-9834-015cc0fd21f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-s8rzt" Oct 07 13:17:11 crc kubenswrapper[4677]: I1007 13:17:11.085097 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e322094c-541b-4dd8-9834-015cc0fd21f7-etc-openvswitch\") pod \"ovnkube-node-s8rzt\" (UID: \"e322094c-541b-4dd8-9834-015cc0fd21f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-s8rzt" Oct 07 13:17:11 crc kubenswrapper[4677]: I1007 13:17:11.085116 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e322094c-541b-4dd8-9834-015cc0fd21f7-ovn-node-metrics-cert\") pod \"ovnkube-node-s8rzt\" (UID: \"e322094c-541b-4dd8-9834-015cc0fd21f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-s8rzt" Oct 07 13:17:11 crc kubenswrapper[4677]: I1007 13:17:11.085220 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e322094c-541b-4dd8-9834-015cc0fd21f7-host-run-ovn-kubernetes\") pod \"ovnkube-node-s8rzt\" (UID: \"e322094c-541b-4dd8-9834-015cc0fd21f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-s8rzt" Oct 07 13:17:11 crc kubenswrapper[4677]: I1007 13:17:11.085284 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e322094c-541b-4dd8-9834-015cc0fd21f7-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-s8rzt\" (UID: \"e322094c-541b-4dd8-9834-015cc0fd21f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-s8rzt" Oct 07 13:17:11 crc kubenswrapper[4677]: I1007 13:17:11.085327 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e322094c-541b-4dd8-9834-015cc0fd21f7-var-lib-openvswitch\") pod \"ovnkube-node-s8rzt\" (UID: \"e322094c-541b-4dd8-9834-015cc0fd21f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-s8rzt" Oct 07 13:17:11 crc kubenswrapper[4677]: I1007 13:17:11.085358 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e322094c-541b-4dd8-9834-015cc0fd21f7-host-kubelet\") pod \"ovnkube-node-s8rzt\" (UID: \"e322094c-541b-4dd8-9834-015cc0fd21f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-s8rzt" Oct 07 13:17:11 crc kubenswrapper[4677]: I1007 13:17:11.085390 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e322094c-541b-4dd8-9834-015cc0fd21f7-run-ovn\") pod \"ovnkube-node-s8rzt\" (UID: \"e322094c-541b-4dd8-9834-015cc0fd21f7\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-s8rzt" Oct 07 13:17:11 crc kubenswrapper[4677]: I1007 13:17:11.085388 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e322094c-541b-4dd8-9834-015cc0fd21f7-host-run-ovn-kubernetes\") pod \"ovnkube-node-s8rzt\" (UID: \"e322094c-541b-4dd8-9834-015cc0fd21f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-s8rzt" Oct 07 13:17:11 crc kubenswrapper[4677]: I1007 13:17:11.085422 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e322094c-541b-4dd8-9834-015cc0fd21f7-log-socket\") pod \"ovnkube-node-s8rzt\" (UID: \"e322094c-541b-4dd8-9834-015cc0fd21f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-s8rzt" Oct 07 13:17:11 crc kubenswrapper[4677]: I1007 13:17:11.085458 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e322094c-541b-4dd8-9834-015cc0fd21f7-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-s8rzt\" (UID: \"e322094c-541b-4dd8-9834-015cc0fd21f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-s8rzt" Oct 07 13:17:11 crc kubenswrapper[4677]: I1007 13:17:11.085495 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e322094c-541b-4dd8-9834-015cc0fd21f7-run-ovn\") pod \"ovnkube-node-s8rzt\" (UID: \"e322094c-541b-4dd8-9834-015cc0fd21f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-s8rzt" Oct 07 13:17:11 crc kubenswrapper[4677]: I1007 13:17:11.085475 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e322094c-541b-4dd8-9834-015cc0fd21f7-run-openvswitch\") pod \"ovnkube-node-s8rzt\" (UID: \"e322094c-541b-4dd8-9834-015cc0fd21f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-s8rzt" Oct 07 13:17:11 crc kubenswrapper[4677]: I1007 13:17:11.085635 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e322094c-541b-4dd8-9834-015cc0fd21f7-systemd-units\") pod \"ovnkube-node-s8rzt\" (UID: \"e322094c-541b-4dd8-9834-015cc0fd21f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-s8rzt" Oct 07 13:17:11 crc kubenswrapper[4677]: I1007 13:17:11.085505 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e322094c-541b-4dd8-9834-015cc0fd21f7-var-lib-openvswitch\") pod \"ovnkube-node-s8rzt\" (UID: \"e322094c-541b-4dd8-9834-015cc0fd21f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-s8rzt" Oct 07 13:17:11 crc kubenswrapper[4677]: I1007 13:17:11.085722 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e322094c-541b-4dd8-9834-015cc0fd21f7-host-run-netns\") pod \"ovnkube-node-s8rzt\" (UID: \"e322094c-541b-4dd8-9834-015cc0fd21f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-s8rzt" Oct 07 13:17:11 crc kubenswrapper[4677]: I1007 13:17:11.085473 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e322094c-541b-4dd8-9834-015cc0fd21f7-host-kubelet\") pod \"ovnkube-node-s8rzt\" (UID: \"e322094c-541b-4dd8-9834-015cc0fd21f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-s8rzt" Oct 07 13:17:11 crc kubenswrapper[4677]: 
I1007 13:17:11.085509 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e322094c-541b-4dd8-9834-015cc0fd21f7-log-socket\") pod \"ovnkube-node-s8rzt\" (UID: \"e322094c-541b-4dd8-9834-015cc0fd21f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-s8rzt" Oct 07 13:17:11 crc kubenswrapper[4677]: I1007 13:17:11.085681 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e322094c-541b-4dd8-9834-015cc0fd21f7-host-run-netns\") pod \"ovnkube-node-s8rzt\" (UID: \"e322094c-541b-4dd8-9834-015cc0fd21f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-s8rzt" Oct 07 13:17:11 crc kubenswrapper[4677]: I1007 13:17:11.085830 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e322094c-541b-4dd8-9834-015cc0fd21f7-run-systemd\") pod \"ovnkube-node-s8rzt\" (UID: \"e322094c-541b-4dd8-9834-015cc0fd21f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-s8rzt" Oct 07 13:17:11 crc kubenswrapper[4677]: I1007 13:17:11.085681 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e322094c-541b-4dd8-9834-015cc0fd21f7-systemd-units\") pod \"ovnkube-node-s8rzt\" (UID: \"e322094c-541b-4dd8-9834-015cc0fd21f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-s8rzt" Oct 07 13:17:11 crc kubenswrapper[4677]: I1007 13:17:11.085882 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e322094c-541b-4dd8-9834-015cc0fd21f7-host-slash\") pod \"ovnkube-node-s8rzt\" (UID: \"e322094c-541b-4dd8-9834-015cc0fd21f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-s8rzt" Oct 07 13:17:11 crc kubenswrapper[4677]: I1007 13:17:11.085912 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e322094c-541b-4dd8-9834-015cc0fd21f7-run-systemd\") pod \"ovnkube-node-s8rzt\" (UID: \"e322094c-541b-4dd8-9834-015cc0fd21f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-s8rzt" Oct 07 13:17:11 crc kubenswrapper[4677]: I1007 13:17:11.085917 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l2xdt\" (UniqueName: \"kubernetes.io/projected/e322094c-541b-4dd8-9834-015cc0fd21f7-kube-api-access-l2xdt\") pod \"ovnkube-node-s8rzt\" (UID: \"e322094c-541b-4dd8-9834-015cc0fd21f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-s8rzt" Oct 07 13:17:11 crc kubenswrapper[4677]: I1007 13:17:11.085950 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e322094c-541b-4dd8-9834-015cc0fd21f7-host-slash\") pod \"ovnkube-node-s8rzt\" (UID: \"e322094c-541b-4dd8-9834-015cc0fd21f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-s8rzt" Oct 07 13:17:11 crc kubenswrapper[4677]: I1007 13:17:11.085964 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e322094c-541b-4dd8-9834-015cc0fd21f7-ovnkube-config\") pod \"ovnkube-node-s8rzt\" (UID: \"e322094c-541b-4dd8-9834-015cc0fd21f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-s8rzt" Oct 07 13:17:11 crc kubenswrapper[4677]: I1007 13:17:11.085505 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/e322094c-541b-4dd8-9834-015cc0fd21f7-run-openvswitch\") pod \"ovnkube-node-s8rzt\" (UID: \"e322094c-541b-4dd8-9834-015cc0fd21f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-s8rzt" Oct 07 13:17:11 crc kubenswrapper[4677]: I1007 13:17:11.086010 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e322094c-541b-4dd8-9834-015cc0fd21f7-host-cni-bin\") pod \"ovnkube-node-s8rzt\" (UID: \"e322094c-541b-4dd8-9834-015cc0fd21f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-s8rzt" Oct 07 13:17:11 crc kubenswrapper[4677]: I1007 13:17:11.086044 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e322094c-541b-4dd8-9834-015cc0fd21f7-node-log\") pod \"ovnkube-node-s8rzt\" (UID: \"e322094c-541b-4dd8-9834-015cc0fd21f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-s8rzt" Oct 07 13:17:11 crc kubenswrapper[4677]: I1007 13:17:11.086064 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e322094c-541b-4dd8-9834-015cc0fd21f7-host-cni-bin\") pod \"ovnkube-node-s8rzt\" (UID: \"e322094c-541b-4dd8-9834-015cc0fd21f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-s8rzt" Oct 07 13:17:11 crc kubenswrapper[4677]: I1007 13:17:11.086078 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e322094c-541b-4dd8-9834-015cc0fd21f7-env-overrides\") pod \"ovnkube-node-s8rzt\" (UID: \"e322094c-541b-4dd8-9834-015cc0fd21f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-s8rzt" Oct 07 13:17:11 crc kubenswrapper[4677]: I1007 13:17:11.086117 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e322094c-541b-4dd8-9834-015cc0fd21f7-host-cni-netd\") pod \"ovnkube-node-s8rzt\" (UID: \"e322094c-541b-4dd8-9834-015cc0fd21f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-s8rzt" Oct 07 13:17:11 crc kubenswrapper[4677]: I1007 13:17:11.086119 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e322094c-541b-4dd8-9834-015cc0fd21f7-node-log\") pod \"ovnkube-node-s8rzt\" (UID: \"e322094c-541b-4dd8-9834-015cc0fd21f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-s8rzt" Oct 07 13:17:11 crc kubenswrapper[4677]: I1007 13:17:11.086212 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e322094c-541b-4dd8-9834-015cc0fd21f7-host-cni-netd\") pod \"ovnkube-node-s8rzt\" (UID: \"e322094c-541b-4dd8-9834-015cc0fd21f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-s8rzt" Oct 07 13:17:11 crc kubenswrapper[4677]: I1007 13:17:11.086246 4677 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/3458826a-000d-407d-92c8-236d1a05842e-run-systemd\") on node \"crc\" DevicePath \"\"" Oct 07 13:17:11 crc kubenswrapper[4677]: I1007 13:17:11.086362 4677 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vm7l2\" (UniqueName: \"kubernetes.io/projected/3458826a-000d-407d-92c8-236d1a05842e-kube-api-access-vm7l2\") on node \"crc\" DevicePath \"\"" Oct 07 13:17:11 crc kubenswrapper[4677]: I1007 13:17:11.086383 4677 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/3458826a-000d-407d-92c8-236d1a05842e-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Oct 07 13:17:11 crc kubenswrapper[4677]: I1007 13:17:11.086673 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e322094c-541b-4dd8-9834-015cc0fd21f7-ovnkube-config\") pod \"ovnkube-node-s8rzt\" (UID: \"e322094c-541b-4dd8-9834-015cc0fd21f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-s8rzt" Oct 07 13:17:11 crc kubenswrapper[4677]: I1007 13:17:11.087025 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e322094c-541b-4dd8-9834-015cc0fd21f7-env-overrides\") pod \"ovnkube-node-s8rzt\" (UID: \"e322094c-541b-4dd8-9834-015cc0fd21f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-s8rzt" Oct 07 13:17:11 crc kubenswrapper[4677]: I1007 13:17:11.087177 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e322094c-541b-4dd8-9834-015cc0fd21f7-ovnkube-script-lib\") pod \"ovnkube-node-s8rzt\" (UID: \"e322094c-541b-4dd8-9834-015cc0fd21f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-s8rzt" Oct 07 13:17:11 crc kubenswrapper[4677]: I1007 13:17:11.091212 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e322094c-541b-4dd8-9834-015cc0fd21f7-ovn-node-metrics-cert\") pod \"ovnkube-node-s8rzt\" (UID: \"e322094c-541b-4dd8-9834-015cc0fd21f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-s8rzt" Oct 07 13:17:11 crc kubenswrapper[4677]: I1007 13:17:11.106519 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l2xdt\" (UniqueName: \"kubernetes.io/projected/e322094c-541b-4dd8-9834-015cc0fd21f7-kube-api-access-l2xdt\") pod \"ovnkube-node-s8rzt\" (UID: \"e322094c-541b-4dd8-9834-015cc0fd21f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-s8rzt" Oct 07 13:17:11 crc kubenswrapper[4677]: I1007 13:17:11.215370 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-s8rzt" Oct 07 13:17:11 crc kubenswrapper[4677]: W1007 13:17:11.241950 4677 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode322094c_541b_4dd8_9834_015cc0fd21f7.slice/crio-37ccfe6cd0dd89d037921880297c3f08ddfbaa44db481021d9db3b0173805837 WatchSource:0}: Error finding container 37ccfe6cd0dd89d037921880297c3f08ddfbaa44db481021d9db3b0173805837: Status 404 returned error can't find the container with id 37ccfe6cd0dd89d037921880297c3f08ddfbaa44db481021d9db3b0173805837 Oct 07 13:17:11 crc kubenswrapper[4677]: I1007 13:17:11.801117 4677 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-29c8j_3458826a-000d-407d-92c8-236d1a05842e/ovn-acl-logging/0.log" Oct 07 13:17:11 crc kubenswrapper[4677]: I1007 13:17:11.801998 4677 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-29c8j_3458826a-000d-407d-92c8-236d1a05842e/ovn-controller/0.log" Oct 07 13:17:11 crc kubenswrapper[4677]: I1007 13:17:11.803002 4677 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-29c8j" Oct 07 13:17:11 crc kubenswrapper[4677]: I1007 13:17:11.805558 4677 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-pjgpx_73bebfb3-50b5-48b6-b348-1d1feb6202d2/kube-multus/2.log" Oct 07 13:17:11 crc kubenswrapper[4677]: I1007 13:17:11.809101 4677 generic.go:334] "Generic (PLEG): container finished" podID="e322094c-541b-4dd8-9834-015cc0fd21f7" containerID="a24fbbb4cc5350d59abddffeacfe5f6f925d93b27f79dc1ef660e288ff6f48e2" exitCode=0 Oct 07 13:17:11 crc kubenswrapper[4677]: I1007 13:17:11.809160 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-s8rzt" event={"ID":"e322094c-541b-4dd8-9834-015cc0fd21f7","Type":"ContainerDied","Data":"a24fbbb4cc5350d59abddffeacfe5f6f925d93b27f79dc1ef660e288ff6f48e2"} Oct 07 13:17:11 crc kubenswrapper[4677]: I1007 13:17:11.809201 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-s8rzt" event={"ID":"e322094c-541b-4dd8-9834-015cc0fd21f7","Type":"ContainerStarted","Data":"37ccfe6cd0dd89d037921880297c3f08ddfbaa44db481021d9db3b0173805837"} Oct 07 13:17:11 crc kubenswrapper[4677]: I1007 13:17:11.874326 4677 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-29c8j"] Oct 07 13:17:11 crc kubenswrapper[4677]: I1007 13:17:11.877866 4677 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-29c8j"] Oct 07 13:17:12 crc kubenswrapper[4677]: I1007 13:17:12.822036 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-s8rzt" event={"ID":"e322094c-541b-4dd8-9834-015cc0fd21f7","Type":"ContainerStarted","Data":"b8bc3ea9bd82f14285696b5d756f9c6767842a599a4065010e71545d06a4fbcb"} Oct 07 13:17:12 crc kubenswrapper[4677]: I1007 13:17:12.822276 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-s8rzt" event={"ID":"e322094c-541b-4dd8-9834-015cc0fd21f7","Type":"ContainerStarted","Data":"c6cc7bb412d117d9ee18c624511e5016d8bfafba0c04a019f1b1dfc80c4e5925"} Oct 07 13:17:12 crc kubenswrapper[4677]: I1007 13:17:12.822291 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-s8rzt" event={"ID":"e322094c-541b-4dd8-9834-015cc0fd21f7","Type":"ContainerStarted","Data":"6dbd36b73212b2916be384b55f22698a01d2118554a4fec851984d22e2c66637"} Oct 07 13:17:12 crc kubenswrapper[4677]: I1007 13:17:12.822302 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-s8rzt" event={"ID":"e322094c-541b-4dd8-9834-015cc0fd21f7","Type":"ContainerStarted","Data":"55f9db19f937ecf37d8688dac79d9119a633dc61b592482c9bfc2fa8498765ae"} Oct 07 13:17:12 crc kubenswrapper[4677]: I1007 13:17:12.822314 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-s8rzt" event={"ID":"e322094c-541b-4dd8-9834-015cc0fd21f7","Type":"ContainerStarted","Data":"69120d142c7b16ba75c68f815942fa4e75f7cd87c53788c611dfd48d3d56e6db"} Oct 07 13:17:13 crc kubenswrapper[4677]: I1007 13:17:13.314614 4677 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3458826a-000d-407d-92c8-236d1a05842e" path="/var/lib/kubelet/pods/3458826a-000d-407d-92c8-236d1a05842e/volumes" Oct 07 13:17:13 crc kubenswrapper[4677]: I1007 13:17:13.830770 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-s8rzt" 
event={"ID":"e322094c-541b-4dd8-9834-015cc0fd21f7","Type":"ContainerStarted","Data":"278f0bff154f98b75962b3eb99551af810e36be2ce8d8acbfc4e4bdeabea536c"} Oct 07 13:17:15 crc kubenswrapper[4677]: I1007 13:17:15.849694 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-s8rzt" event={"ID":"e322094c-541b-4dd8-9834-015cc0fd21f7","Type":"ContainerStarted","Data":"619747c7e5e4a3752f0f1bb3de452dbb1cc5a30a34c178a55979d6c61271f0b6"} Oct 07 13:17:17 crc kubenswrapper[4677]: I1007 13:17:17.865673 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-s8rzt" event={"ID":"e322094c-541b-4dd8-9834-015cc0fd21f7","Type":"ContainerStarted","Data":"c32dc487547d45e886708ffa938ca1e3b67350ed9699e7e21ada5af8ad43c207"} Oct 07 13:17:17 crc kubenswrapper[4677]: I1007 13:17:17.866078 4677 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-s8rzt" Oct 07 13:17:17 crc kubenswrapper[4677]: I1007 13:17:17.866256 4677 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-s8rzt" Oct 07 13:17:17 crc kubenswrapper[4677]: I1007 13:17:17.866273 4677 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-s8rzt" Oct 07 13:17:17 crc kubenswrapper[4677]: I1007 13:17:17.898347 4677 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-s8rzt" Oct 07 13:17:17 crc kubenswrapper[4677]: I1007 13:17:17.911791 4677 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-s8rzt" Oct 07 13:17:17 crc kubenswrapper[4677]: I1007 13:17:17.937553 4677 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-s8rzt" podStartSLOduration=7.937530651 podStartE2EDuration="7.937530651s" podCreationTimestamp="2025-10-07 13:17:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:17:17.906016888 +0000 UTC m=+609.391726013" watchObservedRunningTime="2025-10-07 13:17:17.937530651 +0000 UTC m=+609.423239806" Oct 07 13:17:21 crc kubenswrapper[4677]: I1007 13:17:21.304252 4677 scope.go:117] "RemoveContainer" containerID="cab6ba341a7d3ec923ec6a10fba00b684271e2e0c030e0ed8b119f472414895a" Oct 07 13:17:21 crc kubenswrapper[4677]: E1007 13:17:21.304888 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-pjgpx_openshift-multus(73bebfb3-50b5-48b6-b348-1d1feb6202d2)\"" pod="openshift-multus/multus-pjgpx" podUID="73bebfb3-50b5-48b6-b348-1d1feb6202d2" Oct 07 13:17:32 crc kubenswrapper[4677]: I1007 13:17:32.304053 4677 scope.go:117] "RemoveContainer" containerID="cab6ba341a7d3ec923ec6a10fba00b684271e2e0c030e0ed8b119f472414895a" Oct 07 13:17:32 crc kubenswrapper[4677]: I1007 13:17:32.961336 4677 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-pjgpx_73bebfb3-50b5-48b6-b348-1d1feb6202d2/kube-multus/2.log" Oct 07 13:17:32 crc kubenswrapper[4677]: I1007 13:17:32.961721 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-pjgpx" 
event={"ID":"73bebfb3-50b5-48b6-b348-1d1feb6202d2","Type":"ContainerStarted","Data":"a1992df5fef886ad4d861887a5487db96fb963176145780b42372df0e3190378"} Oct 07 13:17:39 crc kubenswrapper[4677]: I1007 13:17:39.258063 4677 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-index-d49rj"] Oct 07 13:17:39 crc kubenswrapper[4677]: I1007 13:17:39.259694 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-index-d49rj" Oct 07 13:17:39 crc kubenswrapper[4677]: I1007 13:17:39.263766 4677 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-index-dockercfg-t4vjp" Oct 07 13:17:39 crc kubenswrapper[4677]: I1007 13:17:39.264128 4677 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Oct 07 13:17:39 crc kubenswrapper[4677]: I1007 13:17:39.268167 4677 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Oct 07 13:17:39 crc kubenswrapper[4677]: I1007 13:17:39.272498 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-index-d49rj"] Oct 07 13:17:39 crc kubenswrapper[4677]: I1007 13:17:39.407863 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-prktl\" (UniqueName: \"kubernetes.io/projected/6acd9fda-3fa3-4c78-ba60-29336c7c6078-kube-api-access-prktl\") pod \"mariadb-operator-index-d49rj\" (UID: \"6acd9fda-3fa3-4c78-ba60-29336c7c6078\") " pod="openstack-operators/mariadb-operator-index-d49rj" Oct 07 13:17:39 crc kubenswrapper[4677]: I1007 13:17:39.509501 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-prktl\" (UniqueName: \"kubernetes.io/projected/6acd9fda-3fa3-4c78-ba60-29336c7c6078-kube-api-access-prktl\") pod \"mariadb-operator-index-d49rj\" (UID: \"6acd9fda-3fa3-4c78-ba60-29336c7c6078\") " pod="openstack-operators/mariadb-operator-index-d49rj" Oct 07 13:17:39 crc kubenswrapper[4677]: I1007 13:17:39.528018 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-prktl\" (UniqueName: \"kubernetes.io/projected/6acd9fda-3fa3-4c78-ba60-29336c7c6078-kube-api-access-prktl\") pod \"mariadb-operator-index-d49rj\" (UID: \"6acd9fda-3fa3-4c78-ba60-29336c7c6078\") " pod="openstack-operators/mariadb-operator-index-d49rj" Oct 07 13:17:39 crc kubenswrapper[4677]: I1007 13:17:39.577145 4677 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-index-d49rj" Oct 07 13:17:39 crc kubenswrapper[4677]: I1007 13:17:39.864658 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-index-d49rj"] Oct 07 13:17:39 crc kubenswrapper[4677]: I1007 13:17:39.874839 4677 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 07 13:17:40 crc kubenswrapper[4677]: I1007 13:17:40.007237 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-d49rj" event={"ID":"6acd9fda-3fa3-4c78-ba60-29336c7c6078","Type":"ContainerStarted","Data":"51cec9ddd2921cf14ddbd9ead70058e2809002ad03164944d6361ba29e439b65"} Oct 07 13:17:40 crc kubenswrapper[4677]: I1007 13:17:40.918294 4677 patch_prober.go:28] interesting pod/machine-config-daemon-r7cnz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 13:17:40 crc kubenswrapper[4677]: I1007 13:17:40.918799 4677 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-r7cnz" podUID="7879fa59-a7cb-4d29-ba3a-c91f43bfcba6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 13:17:40 crc kubenswrapper[4677]: I1007 13:17:40.918858 4677 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-r7cnz" Oct 07 13:17:40 crc kubenswrapper[4677]: I1007 13:17:40.919564 4677 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"75d4db8c22e96ea7fcbf447dc088ac317cb51f5a548c1df77f076e3a1152231a"} pod="openshift-machine-config-operator/machine-config-daemon-r7cnz" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 07 13:17:40 crc kubenswrapper[4677]: I1007 13:17:40.919642 4677 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-r7cnz" podUID="7879fa59-a7cb-4d29-ba3a-c91f43bfcba6" containerName="machine-config-daemon" containerID="cri-o://75d4db8c22e96ea7fcbf447dc088ac317cb51f5a548c1df77f076e3a1152231a" gracePeriod=600 Oct 07 13:17:41 crc kubenswrapper[4677]: I1007 13:17:41.240962 4677 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-s8rzt" Oct 07 13:17:42 crc kubenswrapper[4677]: I1007 13:17:42.025182 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-d49rj" event={"ID":"6acd9fda-3fa3-4c78-ba60-29336c7c6078","Type":"ContainerStarted","Data":"b0ffaf1939360c22a838c17b96ee7fcb9570e029ef88cb1d8a43e780b97ca155"} Oct 07 13:17:42 crc kubenswrapper[4677]: I1007 13:17:42.028740 4677 generic.go:334] "Generic (PLEG): container finished" podID="7879fa59-a7cb-4d29-ba3a-c91f43bfcba6" containerID="75d4db8c22e96ea7fcbf447dc088ac317cb51f5a548c1df77f076e3a1152231a" exitCode=0 Oct 07 13:17:42 crc kubenswrapper[4677]: I1007 13:17:42.028779 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-r7cnz" 
event={"ID":"7879fa59-a7cb-4d29-ba3a-c91f43bfcba6","Type":"ContainerDied","Data":"75d4db8c22e96ea7fcbf447dc088ac317cb51f5a548c1df77f076e3a1152231a"} Oct 07 13:17:42 crc kubenswrapper[4677]: I1007 13:17:42.028809 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-r7cnz" event={"ID":"7879fa59-a7cb-4d29-ba3a-c91f43bfcba6","Type":"ContainerStarted","Data":"ec42d23040a8452012f48d89f3054555831bcfb79cefb8a91a87385178f388c8"} Oct 07 13:17:42 crc kubenswrapper[4677]: I1007 13:17:42.028840 4677 scope.go:117] "RemoveContainer" containerID="82a5b7d40ad019c3c617ec7d72d51f1fbb5c958d11768560cf6d8828b0539b5b" Oct 07 13:17:42 crc kubenswrapper[4677]: I1007 13:17:42.036923 4677 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/mariadb-operator-index-d49rj"] Oct 07 13:17:42 crc kubenswrapper[4677]: I1007 13:17:42.052476 4677 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-index-d49rj" podStartSLOduration=2.093339262 podStartE2EDuration="3.05241318s" podCreationTimestamp="2025-10-07 13:17:39 +0000 UTC" firstStartedPulling="2025-10-07 13:17:39.874560454 +0000 UTC m=+631.360269569" lastFinishedPulling="2025-10-07 13:17:40.833634332 +0000 UTC m=+632.319343487" observedRunningTime="2025-10-07 13:17:42.049065704 +0000 UTC m=+633.534774899" watchObservedRunningTime="2025-10-07 13:17:42.05241318 +0000 UTC m=+633.538122345" Oct 07 13:17:42 crc kubenswrapper[4677]: I1007 13:17:42.645459 4677 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-index-6fln4"] Oct 07 13:17:42 crc kubenswrapper[4677]: I1007 13:17:42.646531 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-index-6fln4" Oct 07 13:17:42 crc kubenswrapper[4677]: I1007 13:17:42.685105 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-index-6fln4"] Oct 07 13:17:42 crc kubenswrapper[4677]: I1007 13:17:42.756190 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9q9sm\" (UniqueName: \"kubernetes.io/projected/32459eb8-0dbe-4046-9798-85e4cb9aca83-kube-api-access-9q9sm\") pod \"mariadb-operator-index-6fln4\" (UID: \"32459eb8-0dbe-4046-9798-85e4cb9aca83\") " pod="openstack-operators/mariadb-operator-index-6fln4" Oct 07 13:17:42 crc kubenswrapper[4677]: I1007 13:17:42.858086 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9q9sm\" (UniqueName: \"kubernetes.io/projected/32459eb8-0dbe-4046-9798-85e4cb9aca83-kube-api-access-9q9sm\") pod \"mariadb-operator-index-6fln4\" (UID: \"32459eb8-0dbe-4046-9798-85e4cb9aca83\") " pod="openstack-operators/mariadb-operator-index-6fln4" Oct 07 13:17:42 crc kubenswrapper[4677]: I1007 13:17:42.892828 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9q9sm\" (UniqueName: \"kubernetes.io/projected/32459eb8-0dbe-4046-9798-85e4cb9aca83-kube-api-access-9q9sm\") pod \"mariadb-operator-index-6fln4\" (UID: \"32459eb8-0dbe-4046-9798-85e4cb9aca83\") " pod="openstack-operators/mariadb-operator-index-6fln4" Oct 07 13:17:42 crc kubenswrapper[4677]: I1007 13:17:42.988297 4677 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-index-6fln4" Oct 07 13:17:43 crc kubenswrapper[4677]: I1007 13:17:43.458528 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-index-6fln4"] Oct 07 13:17:43 crc kubenswrapper[4677]: W1007 13:17:43.468116 4677 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod32459eb8_0dbe_4046_9798_85e4cb9aca83.slice/crio-461200fbe2f54ad933da07954839b23fb1fe9ada098eb0f323a584b80cef2b88 WatchSource:0}: Error finding container 461200fbe2f54ad933da07954839b23fb1fe9ada098eb0f323a584b80cef2b88: Status 404 returned error can't find the container with id 461200fbe2f54ad933da07954839b23fb1fe9ada098eb0f323a584b80cef2b88 Oct 07 13:17:44 crc kubenswrapper[4677]: I1007 13:17:44.048766 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-6fln4" event={"ID":"32459eb8-0dbe-4046-9798-85e4cb9aca83","Type":"ContainerStarted","Data":"461200fbe2f54ad933da07954839b23fb1fe9ada098eb0f323a584b80cef2b88"} Oct 07 13:17:44 crc kubenswrapper[4677]: I1007 13:17:44.048868 4677 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/mariadb-operator-index-d49rj" podUID="6acd9fda-3fa3-4c78-ba60-29336c7c6078" containerName="registry-server" containerID="cri-o://b0ffaf1939360c22a838c17b96ee7fcb9570e029ef88cb1d8a43e780b97ca155" gracePeriod=2 Oct 07 13:17:44 crc kubenswrapper[4677]: I1007 13:17:44.403703 4677 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-index-d49rj" Oct 07 13:17:44 crc kubenswrapper[4677]: I1007 13:17:44.477457 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-prktl\" (UniqueName: \"kubernetes.io/projected/6acd9fda-3fa3-4c78-ba60-29336c7c6078-kube-api-access-prktl\") pod \"6acd9fda-3fa3-4c78-ba60-29336c7c6078\" (UID: \"6acd9fda-3fa3-4c78-ba60-29336c7c6078\") " Oct 07 13:17:44 crc kubenswrapper[4677]: I1007 13:17:44.484639 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6acd9fda-3fa3-4c78-ba60-29336c7c6078-kube-api-access-prktl" (OuterVolumeSpecName: "kube-api-access-prktl") pod "6acd9fda-3fa3-4c78-ba60-29336c7c6078" (UID: "6acd9fda-3fa3-4c78-ba60-29336c7c6078"). InnerVolumeSpecName "kube-api-access-prktl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:17:44 crc kubenswrapper[4677]: I1007 13:17:44.579195 4677 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-prktl\" (UniqueName: \"kubernetes.io/projected/6acd9fda-3fa3-4c78-ba60-29336c7c6078-kube-api-access-prktl\") on node \"crc\" DevicePath \"\"" Oct 07 13:17:45 crc kubenswrapper[4677]: I1007 13:17:45.059993 4677 generic.go:334] "Generic (PLEG): container finished" podID="6acd9fda-3fa3-4c78-ba60-29336c7c6078" containerID="b0ffaf1939360c22a838c17b96ee7fcb9570e029ef88cb1d8a43e780b97ca155" exitCode=0 Oct 07 13:17:45 crc kubenswrapper[4677]: I1007 13:17:45.060052 4677 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-index-d49rj" Oct 07 13:17:45 crc kubenswrapper[4677]: I1007 13:17:45.060055 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-d49rj" event={"ID":"6acd9fda-3fa3-4c78-ba60-29336c7c6078","Type":"ContainerDied","Data":"b0ffaf1939360c22a838c17b96ee7fcb9570e029ef88cb1d8a43e780b97ca155"} Oct 07 13:17:45 crc kubenswrapper[4677]: I1007 13:17:45.060115 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-d49rj" event={"ID":"6acd9fda-3fa3-4c78-ba60-29336c7c6078","Type":"ContainerDied","Data":"51cec9ddd2921cf14ddbd9ead70058e2809002ad03164944d6361ba29e439b65"} Oct 07 13:17:45 crc kubenswrapper[4677]: I1007 13:17:45.060144 4677 scope.go:117] "RemoveContainer" containerID="b0ffaf1939360c22a838c17b96ee7fcb9570e029ef88cb1d8a43e780b97ca155" Oct 07 13:17:45 crc kubenswrapper[4677]: I1007 13:17:45.062541 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-6fln4" event={"ID":"32459eb8-0dbe-4046-9798-85e4cb9aca83","Type":"ContainerStarted","Data":"971ec747a4481d1eb22f421ff26f9931e447d8743f8a3f8bb6093e169acba357"} Oct 07 13:17:45 crc kubenswrapper[4677]: I1007 13:17:45.091392 4677 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-index-6fln4" podStartSLOduration=2.590694352 podStartE2EDuration="3.091363107s" podCreationTimestamp="2025-10-07 13:17:42 +0000 UTC" firstStartedPulling="2025-10-07 13:17:43.472384472 +0000 UTC m=+634.958093597" lastFinishedPulling="2025-10-07 13:17:43.973053207 +0000 UTC m=+635.458762352" observedRunningTime="2025-10-07 13:17:45.088620849 +0000 UTC m=+636.574330054" watchObservedRunningTime="2025-10-07 13:17:45.091363107 +0000 UTC m=+636.577072292" Oct 07 13:17:45 crc kubenswrapper[4677]: I1007 13:17:45.098419 4677 scope.go:117] "RemoveContainer" containerID="b0ffaf1939360c22a838c17b96ee7fcb9570e029ef88cb1d8a43e780b97ca155" Oct 07 13:17:45 crc kubenswrapper[4677]: E1007 13:17:45.099089 4677 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b0ffaf1939360c22a838c17b96ee7fcb9570e029ef88cb1d8a43e780b97ca155\": container with ID starting with b0ffaf1939360c22a838c17b96ee7fcb9570e029ef88cb1d8a43e780b97ca155 not found: ID does not exist" containerID="b0ffaf1939360c22a838c17b96ee7fcb9570e029ef88cb1d8a43e780b97ca155" Oct 07 13:17:45 crc kubenswrapper[4677]: I1007 13:17:45.099139 4677 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b0ffaf1939360c22a838c17b96ee7fcb9570e029ef88cb1d8a43e780b97ca155"} err="failed to get container status \"b0ffaf1939360c22a838c17b96ee7fcb9570e029ef88cb1d8a43e780b97ca155\": rpc error: code = NotFound desc = could not find container \"b0ffaf1939360c22a838c17b96ee7fcb9570e029ef88cb1d8a43e780b97ca155\": container with ID starting with b0ffaf1939360c22a838c17b96ee7fcb9570e029ef88cb1d8a43e780b97ca155 not found: ID does not exist" Oct 07 13:17:45 crc kubenswrapper[4677]: I1007 13:17:45.108961 4677 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/mariadb-operator-index-d49rj"] Oct 07 13:17:45 crc kubenswrapper[4677]: I1007 13:17:45.114738 4677 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/mariadb-operator-index-d49rj"] Oct 07 13:17:45 crc kubenswrapper[4677]: I1007 13:17:45.314627 4677 kubelet_volumes.go:163] "Cleaned up 
orphaned pod volumes dir" podUID="6acd9fda-3fa3-4c78-ba60-29336c7c6078" path="/var/lib/kubelet/pods/6acd9fda-3fa3-4c78-ba60-29336c7c6078/volumes" Oct 07 13:17:52 crc kubenswrapper[4677]: I1007 13:17:52.990570 4677 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-index-6fln4" Oct 07 13:17:52 crc kubenswrapper[4677]: I1007 13:17:52.991267 4677 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/mariadb-operator-index-6fln4" Oct 07 13:17:53 crc kubenswrapper[4677]: I1007 13:17:53.035754 4677 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/mariadb-operator-index-6fln4" Oct 07 13:17:53 crc kubenswrapper[4677]: I1007 13:17:53.163243 4677 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-index-6fln4" Oct 07 13:17:58 crc kubenswrapper[4677]: I1007 13:17:58.696318 4677 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/10f57ec60de4df58ed39de93369cc80174e5ad08476bc9cf01944ad89058jsn"] Oct 07 13:17:58 crc kubenswrapper[4677]: E1007 13:17:58.696999 4677 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6acd9fda-3fa3-4c78-ba60-29336c7c6078" containerName="registry-server" Oct 07 13:17:58 crc kubenswrapper[4677]: I1007 13:17:58.697027 4677 state_mem.go:107] "Deleted CPUSet assignment" podUID="6acd9fda-3fa3-4c78-ba60-29336c7c6078" containerName="registry-server" Oct 07 13:17:58 crc kubenswrapper[4677]: I1007 13:17:58.697199 4677 memory_manager.go:354] "RemoveStaleState removing state" podUID="6acd9fda-3fa3-4c78-ba60-29336c7c6078" containerName="registry-server" Oct 07 13:17:58 crc kubenswrapper[4677]: I1007 13:17:58.698572 4677 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/10f57ec60de4df58ed39de93369cc80174e5ad08476bc9cf01944ad89058jsn" Oct 07 13:17:58 crc kubenswrapper[4677]: I1007 13:17:58.701647 4677 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-krhf2" Oct 07 13:17:58 crc kubenswrapper[4677]: I1007 13:17:58.704065 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/10f57ec60de4df58ed39de93369cc80174e5ad08476bc9cf01944ad89058jsn"] Oct 07 13:17:58 crc kubenswrapper[4677]: I1007 13:17:58.871782 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0f60c875-0061-46ee-b9b9-8d0dc56ad361-util\") pod \"10f57ec60de4df58ed39de93369cc80174e5ad08476bc9cf01944ad89058jsn\" (UID: \"0f60c875-0061-46ee-b9b9-8d0dc56ad361\") " pod="openstack-operators/10f57ec60de4df58ed39de93369cc80174e5ad08476bc9cf01944ad89058jsn" Oct 07 13:17:58 crc kubenswrapper[4677]: I1007 13:17:58.871873 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0f60c875-0061-46ee-b9b9-8d0dc56ad361-bundle\") pod \"10f57ec60de4df58ed39de93369cc80174e5ad08476bc9cf01944ad89058jsn\" (UID: \"0f60c875-0061-46ee-b9b9-8d0dc56ad361\") " pod="openstack-operators/10f57ec60de4df58ed39de93369cc80174e5ad08476bc9cf01944ad89058jsn" Oct 07 13:17:58 crc kubenswrapper[4677]: I1007 13:17:58.871941 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8bb8x\" (UniqueName: \"kubernetes.io/projected/0f60c875-0061-46ee-b9b9-8d0dc56ad361-kube-api-access-8bb8x\") pod \"10f57ec60de4df58ed39de93369cc80174e5ad08476bc9cf01944ad89058jsn\" (UID: \"0f60c875-0061-46ee-b9b9-8d0dc56ad361\") " pod="openstack-operators/10f57ec60de4df58ed39de93369cc80174e5ad08476bc9cf01944ad89058jsn" Oct 07 13:17:58 crc kubenswrapper[4677]: I1007 13:17:58.972734 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0f60c875-0061-46ee-b9b9-8d0dc56ad361-util\") pod \"10f57ec60de4df58ed39de93369cc80174e5ad08476bc9cf01944ad89058jsn\" (UID: \"0f60c875-0061-46ee-b9b9-8d0dc56ad361\") " pod="openstack-operators/10f57ec60de4df58ed39de93369cc80174e5ad08476bc9cf01944ad89058jsn" Oct 07 13:17:58 crc kubenswrapper[4677]: I1007 13:17:58.972854 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0f60c875-0061-46ee-b9b9-8d0dc56ad361-bundle\") pod \"10f57ec60de4df58ed39de93369cc80174e5ad08476bc9cf01944ad89058jsn\" (UID: \"0f60c875-0061-46ee-b9b9-8d0dc56ad361\") " pod="openstack-operators/10f57ec60de4df58ed39de93369cc80174e5ad08476bc9cf01944ad89058jsn" Oct 07 13:17:58 crc kubenswrapper[4677]: I1007 13:17:58.972912 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8bb8x\" (UniqueName: \"kubernetes.io/projected/0f60c875-0061-46ee-b9b9-8d0dc56ad361-kube-api-access-8bb8x\") pod \"10f57ec60de4df58ed39de93369cc80174e5ad08476bc9cf01944ad89058jsn\" (UID: \"0f60c875-0061-46ee-b9b9-8d0dc56ad361\") " pod="openstack-operators/10f57ec60de4df58ed39de93369cc80174e5ad08476bc9cf01944ad89058jsn" Oct 07 13:17:58 crc kubenswrapper[4677]: I1007 13:17:58.973738 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/0f60c875-0061-46ee-b9b9-8d0dc56ad361-util\") pod \"10f57ec60de4df58ed39de93369cc80174e5ad08476bc9cf01944ad89058jsn\" (UID: \"0f60c875-0061-46ee-b9b9-8d0dc56ad361\") " pod="openstack-operators/10f57ec60de4df58ed39de93369cc80174e5ad08476bc9cf01944ad89058jsn" Oct 07 13:17:58 crc kubenswrapper[4677]: I1007 13:17:58.974004 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0f60c875-0061-46ee-b9b9-8d0dc56ad361-bundle\") pod \"10f57ec60de4df58ed39de93369cc80174e5ad08476bc9cf01944ad89058jsn\" (UID: \"0f60c875-0061-46ee-b9b9-8d0dc56ad361\") " pod="openstack-operators/10f57ec60de4df58ed39de93369cc80174e5ad08476bc9cf01944ad89058jsn" Oct 07 13:17:58 crc kubenswrapper[4677]: I1007 13:17:58.994471 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8bb8x\" (UniqueName: \"kubernetes.io/projected/0f60c875-0061-46ee-b9b9-8d0dc56ad361-kube-api-access-8bb8x\") pod \"10f57ec60de4df58ed39de93369cc80174e5ad08476bc9cf01944ad89058jsn\" (UID: \"0f60c875-0061-46ee-b9b9-8d0dc56ad361\") " pod="openstack-operators/10f57ec60de4df58ed39de93369cc80174e5ad08476bc9cf01944ad89058jsn" Oct 07 13:17:59 crc kubenswrapper[4677]: I1007 13:17:59.029830 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/10f57ec60de4df58ed39de93369cc80174e5ad08476bc9cf01944ad89058jsn" Oct 07 13:17:59 crc kubenswrapper[4677]: I1007 13:17:59.282225 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/10f57ec60de4df58ed39de93369cc80174e5ad08476bc9cf01944ad89058jsn"] Oct 07 13:17:59 crc kubenswrapper[4677]: W1007 13:17:59.288676 4677 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0f60c875_0061_46ee_b9b9_8d0dc56ad361.slice/crio-5f1bb4bbba34b2ddffbc1aedcfd7d1b02c59ce0d6792d8472b5393f4e352c4f4 WatchSource:0}: Error finding container 5f1bb4bbba34b2ddffbc1aedcfd7d1b02c59ce0d6792d8472b5393f4e352c4f4: Status 404 returned error can't find the container with id 5f1bb4bbba34b2ddffbc1aedcfd7d1b02c59ce0d6792d8472b5393f4e352c4f4 Oct 07 13:18:00 crc kubenswrapper[4677]: I1007 13:18:00.184069 4677 generic.go:334] "Generic (PLEG): container finished" podID="0f60c875-0061-46ee-b9b9-8d0dc56ad361" containerID="8f821c67dc0ae0b965267cb00001c219393d199afcf15b092b853968e37b488c" exitCode=0 Oct 07 13:18:00 crc kubenswrapper[4677]: I1007 13:18:00.184133 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/10f57ec60de4df58ed39de93369cc80174e5ad08476bc9cf01944ad89058jsn" event={"ID":"0f60c875-0061-46ee-b9b9-8d0dc56ad361","Type":"ContainerDied","Data":"8f821c67dc0ae0b965267cb00001c219393d199afcf15b092b853968e37b488c"} Oct 07 13:18:00 crc kubenswrapper[4677]: I1007 13:18:00.184172 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/10f57ec60de4df58ed39de93369cc80174e5ad08476bc9cf01944ad89058jsn" event={"ID":"0f60c875-0061-46ee-b9b9-8d0dc56ad361","Type":"ContainerStarted","Data":"5f1bb4bbba34b2ddffbc1aedcfd7d1b02c59ce0d6792d8472b5393f4e352c4f4"} Oct 07 13:18:02 crc kubenswrapper[4677]: I1007 13:18:02.199689 4677 generic.go:334] "Generic (PLEG): container finished" podID="0f60c875-0061-46ee-b9b9-8d0dc56ad361" containerID="452b3bae6954682929a386fc3244997b62e463ca95f0a3e27a24a05b67cd9044" exitCode=0 Oct 07 13:18:02 crc kubenswrapper[4677]: I1007 13:18:02.199876 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/10f57ec60de4df58ed39de93369cc80174e5ad08476bc9cf01944ad89058jsn" event={"ID":"0f60c875-0061-46ee-b9b9-8d0dc56ad361","Type":"ContainerDied","Data":"452b3bae6954682929a386fc3244997b62e463ca95f0a3e27a24a05b67cd9044"} Oct 07 13:18:03 crc kubenswrapper[4677]: I1007 13:18:03.208700 4677 generic.go:334] "Generic (PLEG): container finished" podID="0f60c875-0061-46ee-b9b9-8d0dc56ad361" containerID="d1aa332bb0e5dac07aa681dedf35083ea91ca2bc6b8112e181d9836d8d1a6540" exitCode=0 Oct 07 13:18:03 crc kubenswrapper[4677]: I1007 13:18:03.208895 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/10f57ec60de4df58ed39de93369cc80174e5ad08476bc9cf01944ad89058jsn" event={"ID":"0f60c875-0061-46ee-b9b9-8d0dc56ad361","Type":"ContainerDied","Data":"d1aa332bb0e5dac07aa681dedf35083ea91ca2bc6b8112e181d9836d8d1a6540"} Oct 07 13:18:04 crc kubenswrapper[4677]: I1007 13:18:04.507262 4677 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/10f57ec60de4df58ed39de93369cc80174e5ad08476bc9cf01944ad89058jsn" Oct 07 13:18:04 crc kubenswrapper[4677]: I1007 13:18:04.676175 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0f60c875-0061-46ee-b9b9-8d0dc56ad361-bundle\") pod \"0f60c875-0061-46ee-b9b9-8d0dc56ad361\" (UID: \"0f60c875-0061-46ee-b9b9-8d0dc56ad361\") " Oct 07 13:18:04 crc kubenswrapper[4677]: I1007 13:18:04.677181 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8bb8x\" (UniqueName: \"kubernetes.io/projected/0f60c875-0061-46ee-b9b9-8d0dc56ad361-kube-api-access-8bb8x\") pod \"0f60c875-0061-46ee-b9b9-8d0dc56ad361\" (UID: \"0f60c875-0061-46ee-b9b9-8d0dc56ad361\") " Oct 07 13:18:04 crc kubenswrapper[4677]: I1007 13:18:04.677896 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0f60c875-0061-46ee-b9b9-8d0dc56ad361-util\") pod \"0f60c875-0061-46ee-b9b9-8d0dc56ad361\" (UID: \"0f60c875-0061-46ee-b9b9-8d0dc56ad361\") " Oct 07 13:18:04 crc kubenswrapper[4677]: I1007 13:18:04.677102 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0f60c875-0061-46ee-b9b9-8d0dc56ad361-bundle" (OuterVolumeSpecName: "bundle") pod "0f60c875-0061-46ee-b9b9-8d0dc56ad361" (UID: "0f60c875-0061-46ee-b9b9-8d0dc56ad361"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:18:04 crc kubenswrapper[4677]: I1007 13:18:04.678338 4677 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0f60c875-0061-46ee-b9b9-8d0dc56ad361-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 13:18:04 crc kubenswrapper[4677]: I1007 13:18:04.684096 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f60c875-0061-46ee-b9b9-8d0dc56ad361-kube-api-access-8bb8x" (OuterVolumeSpecName: "kube-api-access-8bb8x") pod "0f60c875-0061-46ee-b9b9-8d0dc56ad361" (UID: "0f60c875-0061-46ee-b9b9-8d0dc56ad361"). InnerVolumeSpecName "kube-api-access-8bb8x". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:18:04 crc kubenswrapper[4677]: I1007 13:18:04.691211 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0f60c875-0061-46ee-b9b9-8d0dc56ad361-util" (OuterVolumeSpecName: "util") pod "0f60c875-0061-46ee-b9b9-8d0dc56ad361" (UID: "0f60c875-0061-46ee-b9b9-8d0dc56ad361"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:18:04 crc kubenswrapper[4677]: I1007 13:18:04.779523 4677 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0f60c875-0061-46ee-b9b9-8d0dc56ad361-util\") on node \"crc\" DevicePath \"\"" Oct 07 13:18:04 crc kubenswrapper[4677]: I1007 13:18:04.779562 4677 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8bb8x\" (UniqueName: \"kubernetes.io/projected/0f60c875-0061-46ee-b9b9-8d0dc56ad361-kube-api-access-8bb8x\") on node \"crc\" DevicePath \"\"" Oct 07 13:18:05 crc kubenswrapper[4677]: I1007 13:18:05.230310 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/10f57ec60de4df58ed39de93369cc80174e5ad08476bc9cf01944ad89058jsn" event={"ID":"0f60c875-0061-46ee-b9b9-8d0dc56ad361","Type":"ContainerDied","Data":"5f1bb4bbba34b2ddffbc1aedcfd7d1b02c59ce0d6792d8472b5393f4e352c4f4"} Oct 07 13:18:05 crc kubenswrapper[4677]: I1007 13:18:05.230382 4677 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5f1bb4bbba34b2ddffbc1aedcfd7d1b02c59ce0d6792d8472b5393f4e352c4f4" Oct 07 13:18:05 crc kubenswrapper[4677]: I1007 13:18:05.230546 4677 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/10f57ec60de4df58ed39de93369cc80174e5ad08476bc9cf01944ad89058jsn" Oct 07 13:18:09 crc kubenswrapper[4677]: I1007 13:18:09.579105 4677 scope.go:117] "RemoveContainer" containerID="fa229d0805576aa04db8e58e52e8c8bdc07663414dde42a94cac4fe628cfac7a" Oct 07 13:18:09 crc kubenswrapper[4677]: I1007 13:18:09.598486 4677 scope.go:117] "RemoveContainer" containerID="4931c26a24a9442024978a83085456f080f6de6d5f334a435bcc6ced01d30f93" Oct 07 13:18:09 crc kubenswrapper[4677]: I1007 13:18:09.628846 4677 scope.go:117] "RemoveContainer" containerID="eee7c253a1a514447553be977a3e534608ef6a1178664bf139ee84ec41180db0" Oct 07 13:18:09 crc kubenswrapper[4677]: I1007 13:18:09.645721 4677 scope.go:117] "RemoveContainer" containerID="f333db7aeb7d3cd308131992b4cd1284c1c56e27bbfd731404febc0efc953925" Oct 07 13:18:09 crc kubenswrapper[4677]: I1007 13:18:09.666930 4677 scope.go:117] "RemoveContainer" containerID="99b5fbb5ad3249aa5264c37bd635ed5f6283ec72c7eb071002cd7bddc12052f7" Oct 07 13:18:09 crc kubenswrapper[4677]: I1007 13:18:09.684930 4677 scope.go:117] "RemoveContainer" containerID="b77b2aafb3baf1c5b72d62156bd1c1bec76385637d5795166fe3d4f22a169503" Oct 07 13:18:09 crc kubenswrapper[4677]: I1007 13:18:09.706776 4677 scope.go:117] "RemoveContainer" containerID="1cf7d8cdd34bc883eae38c5e4690efd4e1c29cc633b5bbadc5de2b5b844a9da3" Oct 07 13:18:09 crc kubenswrapper[4677]: I1007 13:18:09.734740 4677 scope.go:117] "RemoveContainer" containerID="3ddf4e352b778815786f6fb204486a53d958310e53569f89a2895fe388a727da" Oct 07 13:18:09 crc kubenswrapper[4677]: I1007 13:18:09.747751 4677 scope.go:117] "RemoveContainer" containerID="b1f410624ff7e026c196d43d5ef830ce7b34981b703d5399a135dab0122640ca" Oct 07 13:18:12 crc kubenswrapper[4677]: I1007 13:18:12.417377 4677 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack-operators/mariadb-operator-controller-manager-847ff55875-g5gf7"] Oct 07 13:18:12 crc kubenswrapper[4677]: E1007 13:18:12.417927 4677 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f60c875-0061-46ee-b9b9-8d0dc56ad361" containerName="util" Oct 07 13:18:12 crc kubenswrapper[4677]: I1007 13:18:12.417940 4677 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f60c875-0061-46ee-b9b9-8d0dc56ad361" containerName="util" Oct 07 13:18:12 crc kubenswrapper[4677]: E1007 13:18:12.417949 4677 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f60c875-0061-46ee-b9b9-8d0dc56ad361" containerName="pull" Oct 07 13:18:12 crc kubenswrapper[4677]: I1007 13:18:12.417957 4677 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f60c875-0061-46ee-b9b9-8d0dc56ad361" containerName="pull" Oct 07 13:18:12 crc kubenswrapper[4677]: E1007 13:18:12.417967 4677 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f60c875-0061-46ee-b9b9-8d0dc56ad361" containerName="extract" Oct 07 13:18:12 crc kubenswrapper[4677]: I1007 13:18:12.417973 4677 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f60c875-0061-46ee-b9b9-8d0dc56ad361" containerName="extract" Oct 07 13:18:12 crc kubenswrapper[4677]: I1007 13:18:12.418061 4677 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f60c875-0061-46ee-b9b9-8d0dc56ad361" containerName="extract" Oct 07 13:18:12 crc kubenswrapper[4677]: I1007 13:18:12.418619 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-847ff55875-g5gf7" Oct 07 13:18:12 crc kubenswrapper[4677]: I1007 13:18:12.422479 4677 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-service-cert" Oct 07 13:18:12 crc kubenswrapper[4677]: I1007 13:18:12.422567 4677 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Oct 07 13:18:12 crc kubenswrapper[4677]: I1007 13:18:12.422601 4677 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-6kwrd" Oct 07 13:18:12 crc kubenswrapper[4677]: I1007 13:18:12.435672 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-847ff55875-g5gf7"] Oct 07 13:18:12 crc kubenswrapper[4677]: I1007 13:18:12.582955 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n5gm4\" (UniqueName: \"kubernetes.io/projected/f1f08780-3bfe-4470-87d2-05bfaf5d89ce-kube-api-access-n5gm4\") pod \"mariadb-operator-controller-manager-847ff55875-g5gf7\" (UID: \"f1f08780-3bfe-4470-87d2-05bfaf5d89ce\") " pod="openstack-operators/mariadb-operator-controller-manager-847ff55875-g5gf7" Oct 07 13:18:12 crc kubenswrapper[4677]: I1007 13:18:12.582999 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f1f08780-3bfe-4470-87d2-05bfaf5d89ce-apiservice-cert\") pod \"mariadb-operator-controller-manager-847ff55875-g5gf7\" (UID: \"f1f08780-3bfe-4470-87d2-05bfaf5d89ce\") " pod="openstack-operators/mariadb-operator-controller-manager-847ff55875-g5gf7" Oct 07 13:18:12 crc kubenswrapper[4677]: I1007 13:18:12.583032 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/f1f08780-3bfe-4470-87d2-05bfaf5d89ce-webhook-cert\") pod \"mariadb-operator-controller-manager-847ff55875-g5gf7\" (UID: \"f1f08780-3bfe-4470-87d2-05bfaf5d89ce\") " pod="openstack-operators/mariadb-operator-controller-manager-847ff55875-g5gf7" Oct 07 13:18:12 crc kubenswrapper[4677]: I1007 13:18:12.684094 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n5gm4\" (UniqueName: \"kubernetes.io/projected/f1f08780-3bfe-4470-87d2-05bfaf5d89ce-kube-api-access-n5gm4\") pod \"mariadb-operator-controller-manager-847ff55875-g5gf7\" (UID: \"f1f08780-3bfe-4470-87d2-05bfaf5d89ce\") " pod="openstack-operators/mariadb-operator-controller-manager-847ff55875-g5gf7" Oct 07 13:18:12 crc kubenswrapper[4677]: I1007 13:18:12.684151 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f1f08780-3bfe-4470-87d2-05bfaf5d89ce-apiservice-cert\") pod \"mariadb-operator-controller-manager-847ff55875-g5gf7\" (UID: \"f1f08780-3bfe-4470-87d2-05bfaf5d89ce\") " pod="openstack-operators/mariadb-operator-controller-manager-847ff55875-g5gf7" Oct 07 13:18:12 crc kubenswrapper[4677]: I1007 13:18:12.684197 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f1f08780-3bfe-4470-87d2-05bfaf5d89ce-webhook-cert\") pod \"mariadb-operator-controller-manager-847ff55875-g5gf7\" (UID: \"f1f08780-3bfe-4470-87d2-05bfaf5d89ce\") " pod="openstack-operators/mariadb-operator-controller-manager-847ff55875-g5gf7" Oct 07 13:18:12 crc kubenswrapper[4677]: I1007 13:18:12.690251 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f1f08780-3bfe-4470-87d2-05bfaf5d89ce-webhook-cert\") pod \"mariadb-operator-controller-manager-847ff55875-g5gf7\" (UID: \"f1f08780-3bfe-4470-87d2-05bfaf5d89ce\") " pod="openstack-operators/mariadb-operator-controller-manager-847ff55875-g5gf7" Oct 07 13:18:12 crc kubenswrapper[4677]: I1007 13:18:12.691732 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f1f08780-3bfe-4470-87d2-05bfaf5d89ce-apiservice-cert\") pod \"mariadb-operator-controller-manager-847ff55875-g5gf7\" (UID: \"f1f08780-3bfe-4470-87d2-05bfaf5d89ce\") " pod="openstack-operators/mariadb-operator-controller-manager-847ff55875-g5gf7" Oct 07 13:18:12 crc kubenswrapper[4677]: I1007 13:18:12.699611 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n5gm4\" (UniqueName: \"kubernetes.io/projected/f1f08780-3bfe-4470-87d2-05bfaf5d89ce-kube-api-access-n5gm4\") pod \"mariadb-operator-controller-manager-847ff55875-g5gf7\" (UID: \"f1f08780-3bfe-4470-87d2-05bfaf5d89ce\") " pod="openstack-operators/mariadb-operator-controller-manager-847ff55875-g5gf7" Oct 07 13:18:12 crc kubenswrapper[4677]: I1007 13:18:12.734644 4677 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-847ff55875-g5gf7" Oct 07 13:18:12 crc kubenswrapper[4677]: W1007 13:18:12.955290 4677 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf1f08780_3bfe_4470_87d2_05bfaf5d89ce.slice/crio-8d84ae3d0dc7d359a9af69efc25bae9e038ec65a6a8d74dcb63263cd957ef120 WatchSource:0}: Error finding container 8d84ae3d0dc7d359a9af69efc25bae9e038ec65a6a8d74dcb63263cd957ef120: Status 404 returned error can't find the container with id 8d84ae3d0dc7d359a9af69efc25bae9e038ec65a6a8d74dcb63263cd957ef120 Oct 07 13:18:12 crc kubenswrapper[4677]: I1007 13:18:12.956051 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-847ff55875-g5gf7"] Oct 07 13:18:13 crc kubenswrapper[4677]: I1007 13:18:13.285850 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-847ff55875-g5gf7" event={"ID":"f1f08780-3bfe-4470-87d2-05bfaf5d89ce","Type":"ContainerStarted","Data":"8d84ae3d0dc7d359a9af69efc25bae9e038ec65a6a8d74dcb63263cd957ef120"} Oct 07 13:18:16 crc kubenswrapper[4677]: I1007 13:18:16.321975 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-847ff55875-g5gf7" event={"ID":"f1f08780-3bfe-4470-87d2-05bfaf5d89ce","Type":"ContainerStarted","Data":"544fb2e8b8509a0c4dfac25f799cad7c582e63dfe6cedf6652d38e359ad8ec15"} Oct 07 13:18:19 crc kubenswrapper[4677]: I1007 13:18:19.337102 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-847ff55875-g5gf7" event={"ID":"f1f08780-3bfe-4470-87d2-05bfaf5d89ce","Type":"ContainerStarted","Data":"4eab8717e6659894ada0e2205510ff0edba5e637138f2090e1b917320033e79b"} Oct 07 13:18:19 crc kubenswrapper[4677]: I1007 13:18:19.337343 4677 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-847ff55875-g5gf7" Oct 07 13:18:19 crc kubenswrapper[4677]: I1007 13:18:19.354136 4677 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-847ff55875-g5gf7" podStartSLOduration=1.747743599 podStartE2EDuration="7.354115932s" podCreationTimestamp="2025-10-07 13:18:12 +0000 UTC" firstStartedPulling="2025-10-07 13:18:12.959424173 +0000 UTC m=+664.445133288" lastFinishedPulling="2025-10-07 13:18:18.565796486 +0000 UTC m=+670.051505621" observedRunningTime="2025-10-07 13:18:19.351584019 +0000 UTC m=+670.837293154" watchObservedRunningTime="2025-10-07 13:18:19.354115932 +0000 UTC m=+670.839825057" Oct 07 13:18:22 crc kubenswrapper[4677]: I1007 13:18:22.742101 4677 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-847ff55875-g5gf7" Oct 07 13:18:23 crc kubenswrapper[4677]: I1007 13:18:23.998907 4677 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2xjkqz"] Oct 07 13:18:24 crc kubenswrapper[4677]: I1007 13:18:24.000294 4677 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2xjkqz" Oct 07 13:18:24 crc kubenswrapper[4677]: I1007 13:18:24.005327 4677 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Oct 07 13:18:24 crc kubenswrapper[4677]: I1007 13:18:24.013493 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2xjkqz"] Oct 07 13:18:24 crc kubenswrapper[4677]: I1007 13:18:24.135727 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/987fd203-f583-44fe-b845-d33510d6bb30-bundle\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2xjkqz\" (UID: \"987fd203-f583-44fe-b845-d33510d6bb30\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2xjkqz" Oct 07 13:18:24 crc kubenswrapper[4677]: I1007 13:18:24.135838 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/987fd203-f583-44fe-b845-d33510d6bb30-util\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2xjkqz\" (UID: \"987fd203-f583-44fe-b845-d33510d6bb30\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2xjkqz" Oct 07 13:18:24 crc kubenswrapper[4677]: I1007 13:18:24.135876 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mdjlm\" (UniqueName: \"kubernetes.io/projected/987fd203-f583-44fe-b845-d33510d6bb30-kube-api-access-mdjlm\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2xjkqz\" (UID: \"987fd203-f583-44fe-b845-d33510d6bb30\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2xjkqz" Oct 07 13:18:24 crc kubenswrapper[4677]: I1007 13:18:24.237482 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/987fd203-f583-44fe-b845-d33510d6bb30-bundle\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2xjkqz\" (UID: \"987fd203-f583-44fe-b845-d33510d6bb30\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2xjkqz" Oct 07 13:18:24 crc kubenswrapper[4677]: I1007 13:18:24.237787 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/987fd203-f583-44fe-b845-d33510d6bb30-util\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2xjkqz\" (UID: \"987fd203-f583-44fe-b845-d33510d6bb30\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2xjkqz" Oct 07 13:18:24 crc kubenswrapper[4677]: I1007 13:18:24.237880 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mdjlm\" (UniqueName: \"kubernetes.io/projected/987fd203-f583-44fe-b845-d33510d6bb30-kube-api-access-mdjlm\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2xjkqz\" (UID: \"987fd203-f583-44fe-b845-d33510d6bb30\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2xjkqz" Oct 07 13:18:24 crc kubenswrapper[4677]: I1007 13:18:24.238294 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/987fd203-f583-44fe-b845-d33510d6bb30-bundle\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2xjkqz\" (UID: \"987fd203-f583-44fe-b845-d33510d6bb30\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2xjkqz" Oct 07 13:18:24 crc kubenswrapper[4677]: I1007 13:18:24.238366 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/987fd203-f583-44fe-b845-d33510d6bb30-util\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2xjkqz\" (UID: \"987fd203-f583-44fe-b845-d33510d6bb30\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2xjkqz" Oct 07 13:18:24 crc kubenswrapper[4677]: I1007 13:18:24.257378 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mdjlm\" (UniqueName: \"kubernetes.io/projected/987fd203-f583-44fe-b845-d33510d6bb30-kube-api-access-mdjlm\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2xjkqz\" (UID: \"987fd203-f583-44fe-b845-d33510d6bb30\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2xjkqz" Oct 07 13:18:24 crc kubenswrapper[4677]: I1007 13:18:24.316559 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2xjkqz" Oct 07 13:18:24 crc kubenswrapper[4677]: I1007 13:18:24.551327 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2xjkqz"] Oct 07 13:18:25 crc kubenswrapper[4677]: I1007 13:18:25.375070 4677 generic.go:334] "Generic (PLEG): container finished" podID="987fd203-f583-44fe-b845-d33510d6bb30" containerID="6d8cd2988130cd03b508251f353b12da742c94ba57d4b38bf40b6be68bd5f5d6" exitCode=0 Oct 07 13:18:25 crc kubenswrapper[4677]: I1007 13:18:25.375121 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2xjkqz" event={"ID":"987fd203-f583-44fe-b845-d33510d6bb30","Type":"ContainerDied","Data":"6d8cd2988130cd03b508251f353b12da742c94ba57d4b38bf40b6be68bd5f5d6"} Oct 07 13:18:25 crc kubenswrapper[4677]: I1007 13:18:25.375479 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2xjkqz" event={"ID":"987fd203-f583-44fe-b845-d33510d6bb30","Type":"ContainerStarted","Data":"60da900dc270d2f5f36c9fd83b21d3f607b07bda43e335dd8249ff09b4c56f6c"} Oct 07 13:18:27 crc kubenswrapper[4677]: I1007 13:18:27.390653 4677 generic.go:334] "Generic (PLEG): container finished" podID="987fd203-f583-44fe-b845-d33510d6bb30" containerID="11ad8e8a1d5d0abc564b850c5548bd674e45e0183202bb6ec64ea1411de811ae" exitCode=0 Oct 07 13:18:27 crc kubenswrapper[4677]: I1007 13:18:27.390724 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2xjkqz" event={"ID":"987fd203-f583-44fe-b845-d33510d6bb30","Type":"ContainerDied","Data":"11ad8e8a1d5d0abc564b850c5548bd674e45e0183202bb6ec64ea1411de811ae"} Oct 07 13:18:28 crc kubenswrapper[4677]: I1007 13:18:28.397201 4677 generic.go:334] "Generic (PLEG): container finished" podID="987fd203-f583-44fe-b845-d33510d6bb30" containerID="64e7635a924a5cc24cad63b4127166da716040f254610fa9b77f3c59bc12dc00" exitCode=0 Oct 07 13:18:28 crc kubenswrapper[4677]: I1007 
13:18:28.397290 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2xjkqz" event={"ID":"987fd203-f583-44fe-b845-d33510d6bb30","Type":"ContainerDied","Data":"64e7635a924a5cc24cad63b4127166da716040f254610fa9b77f3c59bc12dc00"} Oct 07 13:18:29 crc kubenswrapper[4677]: I1007 13:18:29.714002 4677 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2xjkqz" Oct 07 13:18:29 crc kubenswrapper[4677]: I1007 13:18:29.807904 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/987fd203-f583-44fe-b845-d33510d6bb30-util\") pod \"987fd203-f583-44fe-b845-d33510d6bb30\" (UID: \"987fd203-f583-44fe-b845-d33510d6bb30\") " Oct 07 13:18:29 crc kubenswrapper[4677]: I1007 13:18:29.807960 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mdjlm\" (UniqueName: \"kubernetes.io/projected/987fd203-f583-44fe-b845-d33510d6bb30-kube-api-access-mdjlm\") pod \"987fd203-f583-44fe-b845-d33510d6bb30\" (UID: \"987fd203-f583-44fe-b845-d33510d6bb30\") " Oct 07 13:18:29 crc kubenswrapper[4677]: I1007 13:18:29.808004 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/987fd203-f583-44fe-b845-d33510d6bb30-bundle\") pod \"987fd203-f583-44fe-b845-d33510d6bb30\" (UID: \"987fd203-f583-44fe-b845-d33510d6bb30\") " Oct 07 13:18:29 crc kubenswrapper[4677]: I1007 13:18:29.809504 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/987fd203-f583-44fe-b845-d33510d6bb30-bundle" (OuterVolumeSpecName: "bundle") pod "987fd203-f583-44fe-b845-d33510d6bb30" (UID: "987fd203-f583-44fe-b845-d33510d6bb30"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:18:29 crc kubenswrapper[4677]: I1007 13:18:29.815586 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/987fd203-f583-44fe-b845-d33510d6bb30-kube-api-access-mdjlm" (OuterVolumeSpecName: "kube-api-access-mdjlm") pod "987fd203-f583-44fe-b845-d33510d6bb30" (UID: "987fd203-f583-44fe-b845-d33510d6bb30"). InnerVolumeSpecName "kube-api-access-mdjlm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:18:29 crc kubenswrapper[4677]: I1007 13:18:29.838034 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/987fd203-f583-44fe-b845-d33510d6bb30-util" (OuterVolumeSpecName: "util") pod "987fd203-f583-44fe-b845-d33510d6bb30" (UID: "987fd203-f583-44fe-b845-d33510d6bb30"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:18:29 crc kubenswrapper[4677]: I1007 13:18:29.909067 4677 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/987fd203-f583-44fe-b845-d33510d6bb30-util\") on node \"crc\" DevicePath \"\"" Oct 07 13:18:29 crc kubenswrapper[4677]: I1007 13:18:29.909112 4677 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mdjlm\" (UniqueName: \"kubernetes.io/projected/987fd203-f583-44fe-b845-d33510d6bb30-kube-api-access-mdjlm\") on node \"crc\" DevicePath \"\"" Oct 07 13:18:29 crc kubenswrapper[4677]: I1007 13:18:29.909129 4677 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/987fd203-f583-44fe-b845-d33510d6bb30-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 13:18:30 crc kubenswrapper[4677]: I1007 13:18:30.416425 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2xjkqz" event={"ID":"987fd203-f583-44fe-b845-d33510d6bb30","Type":"ContainerDied","Data":"60da900dc270d2f5f36c9fd83b21d3f607b07bda43e335dd8249ff09b4c56f6c"} Oct 07 13:18:30 crc kubenswrapper[4677]: I1007 13:18:30.416769 4677 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="60da900dc270d2f5f36c9fd83b21d3f607b07bda43e335dd8249ff09b4c56f6c" Oct 07 13:18:30 crc kubenswrapper[4677]: I1007 13:18:30.416598 4677 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2xjkqz" Oct 07 13:18:38 crc kubenswrapper[4677]: I1007 13:18:38.545054 4677 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-859694fc4f-rbbdh"] Oct 07 13:18:38 crc kubenswrapper[4677]: E1007 13:18:38.545768 4677 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="987fd203-f583-44fe-b845-d33510d6bb30" containerName="util" Oct 07 13:18:38 crc kubenswrapper[4677]: I1007 13:18:38.545781 4677 state_mem.go:107] "Deleted CPUSet assignment" podUID="987fd203-f583-44fe-b845-d33510d6bb30" containerName="util" Oct 07 13:18:38 crc kubenswrapper[4677]: E1007 13:18:38.545791 4677 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="987fd203-f583-44fe-b845-d33510d6bb30" containerName="pull" Oct 07 13:18:38 crc kubenswrapper[4677]: I1007 13:18:38.545798 4677 state_mem.go:107] "Deleted CPUSet assignment" podUID="987fd203-f583-44fe-b845-d33510d6bb30" containerName="pull" Oct 07 13:18:38 crc kubenswrapper[4677]: E1007 13:18:38.545814 4677 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="987fd203-f583-44fe-b845-d33510d6bb30" containerName="extract" Oct 07 13:18:38 crc kubenswrapper[4677]: I1007 13:18:38.545820 4677 state_mem.go:107] "Deleted CPUSet assignment" podUID="987fd203-f583-44fe-b845-d33510d6bb30" containerName="extract" Oct 07 13:18:38 crc kubenswrapper[4677]: I1007 13:18:38.545929 4677 memory_manager.go:354] "RemoveStaleState removing state" podUID="987fd203-f583-44fe-b845-d33510d6bb30" containerName="extract" Oct 07 13:18:38 crc kubenswrapper[4677]: I1007 13:18:38.546302 4677 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-859694fc4f-rbbdh" Oct 07 13:18:38 crc kubenswrapper[4677]: I1007 13:18:38.548184 4677 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Oct 07 13:18:38 crc kubenswrapper[4677]: I1007 13:18:38.548737 4677 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Oct 07 13:18:38 crc kubenswrapper[4677]: I1007 13:18:38.548799 4677 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Oct 07 13:18:38 crc kubenswrapper[4677]: I1007 13:18:38.548906 4677 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Oct 07 13:18:38 crc kubenswrapper[4677]: I1007 13:18:38.551679 4677 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-rrx9l" Oct 07 13:18:38 crc kubenswrapper[4677]: I1007 13:18:38.574103 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-859694fc4f-rbbdh"] Oct 07 13:18:38 crc kubenswrapper[4677]: I1007 13:18:38.712731 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3c7d4a9a-028b-4131-8a8e-722259f8cd2c-webhook-cert\") pod \"metallb-operator-controller-manager-859694fc4f-rbbdh\" (UID: \"3c7d4a9a-028b-4131-8a8e-722259f8cd2c\") " pod="metallb-system/metallb-operator-controller-manager-859694fc4f-rbbdh" Oct 07 13:18:38 crc kubenswrapper[4677]: I1007 13:18:38.712808 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3c7d4a9a-028b-4131-8a8e-722259f8cd2c-apiservice-cert\") pod \"metallb-operator-controller-manager-859694fc4f-rbbdh\" (UID: \"3c7d4a9a-028b-4131-8a8e-722259f8cd2c\") " pod="metallb-system/metallb-operator-controller-manager-859694fc4f-rbbdh" Oct 07 13:18:38 crc kubenswrapper[4677]: I1007 13:18:38.712875 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vqqvz\" (UniqueName: \"kubernetes.io/projected/3c7d4a9a-028b-4131-8a8e-722259f8cd2c-kube-api-access-vqqvz\") pod \"metallb-operator-controller-manager-859694fc4f-rbbdh\" (UID: \"3c7d4a9a-028b-4131-8a8e-722259f8cd2c\") " pod="metallb-system/metallb-operator-controller-manager-859694fc4f-rbbdh" Oct 07 13:18:38 crc kubenswrapper[4677]: I1007 13:18:38.774576 4677 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-54787cf69c-lk2bh"] Oct 07 13:18:38 crc kubenswrapper[4677]: I1007 13:18:38.775288 4677 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-54787cf69c-lk2bh" Oct 07 13:18:38 crc kubenswrapper[4677]: I1007 13:18:38.777241 4677 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Oct 07 13:18:38 crc kubenswrapper[4677]: I1007 13:18:38.777984 4677 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Oct 07 13:18:38 crc kubenswrapper[4677]: I1007 13:18:38.778045 4677 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-7wh66" Oct 07 13:18:38 crc kubenswrapper[4677]: I1007 13:18:38.795293 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-54787cf69c-lk2bh"] Oct 07 13:18:38 crc kubenswrapper[4677]: I1007 13:18:38.813608 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3c7d4a9a-028b-4131-8a8e-722259f8cd2c-webhook-cert\") pod \"metallb-operator-controller-manager-859694fc4f-rbbdh\" (UID: \"3c7d4a9a-028b-4131-8a8e-722259f8cd2c\") " pod="metallb-system/metallb-operator-controller-manager-859694fc4f-rbbdh" Oct 07 13:18:38 crc kubenswrapper[4677]: I1007 13:18:38.813694 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3c7d4a9a-028b-4131-8a8e-722259f8cd2c-apiservice-cert\") pod \"metallb-operator-controller-manager-859694fc4f-rbbdh\" (UID: \"3c7d4a9a-028b-4131-8a8e-722259f8cd2c\") " pod="metallb-system/metallb-operator-controller-manager-859694fc4f-rbbdh" Oct 07 13:18:38 crc kubenswrapper[4677]: I1007 13:18:38.813763 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vqqvz\" (UniqueName: \"kubernetes.io/projected/3c7d4a9a-028b-4131-8a8e-722259f8cd2c-kube-api-access-vqqvz\") pod \"metallb-operator-controller-manager-859694fc4f-rbbdh\" (UID: \"3c7d4a9a-028b-4131-8a8e-722259f8cd2c\") " pod="metallb-system/metallb-operator-controller-manager-859694fc4f-rbbdh" Oct 07 13:18:38 crc kubenswrapper[4677]: I1007 13:18:38.818851 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3c7d4a9a-028b-4131-8a8e-722259f8cd2c-webhook-cert\") pod \"metallb-operator-controller-manager-859694fc4f-rbbdh\" (UID: \"3c7d4a9a-028b-4131-8a8e-722259f8cd2c\") " pod="metallb-system/metallb-operator-controller-manager-859694fc4f-rbbdh" Oct 07 13:18:38 crc kubenswrapper[4677]: I1007 13:18:38.825365 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3c7d4a9a-028b-4131-8a8e-722259f8cd2c-apiservice-cert\") pod \"metallb-operator-controller-manager-859694fc4f-rbbdh\" (UID: \"3c7d4a9a-028b-4131-8a8e-722259f8cd2c\") " pod="metallb-system/metallb-operator-controller-manager-859694fc4f-rbbdh" Oct 07 13:18:38 crc kubenswrapper[4677]: I1007 13:18:38.834502 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vqqvz\" (UniqueName: \"kubernetes.io/projected/3c7d4a9a-028b-4131-8a8e-722259f8cd2c-kube-api-access-vqqvz\") pod \"metallb-operator-controller-manager-859694fc4f-rbbdh\" (UID: \"3c7d4a9a-028b-4131-8a8e-722259f8cd2c\") " pod="metallb-system/metallb-operator-controller-manager-859694fc4f-rbbdh" Oct 07 13:18:38 crc kubenswrapper[4677]: I1007 13:18:38.860675 4677 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-859694fc4f-rbbdh" Oct 07 13:18:38 crc kubenswrapper[4677]: I1007 13:18:38.915028 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/407281a9-edd0-4bc9-bb53-dee866971f52-webhook-cert\") pod \"metallb-operator-webhook-server-54787cf69c-lk2bh\" (UID: \"407281a9-edd0-4bc9-bb53-dee866971f52\") " pod="metallb-system/metallb-operator-webhook-server-54787cf69c-lk2bh" Oct 07 13:18:38 crc kubenswrapper[4677]: I1007 13:18:38.915101 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l52v7\" (UniqueName: \"kubernetes.io/projected/407281a9-edd0-4bc9-bb53-dee866971f52-kube-api-access-l52v7\") pod \"metallb-operator-webhook-server-54787cf69c-lk2bh\" (UID: \"407281a9-edd0-4bc9-bb53-dee866971f52\") " pod="metallb-system/metallb-operator-webhook-server-54787cf69c-lk2bh" Oct 07 13:18:38 crc kubenswrapper[4677]: I1007 13:18:38.915192 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/407281a9-edd0-4bc9-bb53-dee866971f52-apiservice-cert\") pod \"metallb-operator-webhook-server-54787cf69c-lk2bh\" (UID: \"407281a9-edd0-4bc9-bb53-dee866971f52\") " pod="metallb-system/metallb-operator-webhook-server-54787cf69c-lk2bh" Oct 07 13:18:39 crc kubenswrapper[4677]: I1007 13:18:39.016565 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l52v7\" (UniqueName: \"kubernetes.io/projected/407281a9-edd0-4bc9-bb53-dee866971f52-kube-api-access-l52v7\") pod \"metallb-operator-webhook-server-54787cf69c-lk2bh\" (UID: \"407281a9-edd0-4bc9-bb53-dee866971f52\") " pod="metallb-system/metallb-operator-webhook-server-54787cf69c-lk2bh" Oct 07 13:18:39 crc kubenswrapper[4677]: I1007 13:18:39.017157 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/407281a9-edd0-4bc9-bb53-dee866971f52-apiservice-cert\") pod \"metallb-operator-webhook-server-54787cf69c-lk2bh\" (UID: \"407281a9-edd0-4bc9-bb53-dee866971f52\") " pod="metallb-system/metallb-operator-webhook-server-54787cf69c-lk2bh" Oct 07 13:18:39 crc kubenswrapper[4677]: I1007 13:18:39.017200 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/407281a9-edd0-4bc9-bb53-dee866971f52-webhook-cert\") pod \"metallb-operator-webhook-server-54787cf69c-lk2bh\" (UID: \"407281a9-edd0-4bc9-bb53-dee866971f52\") " pod="metallb-system/metallb-operator-webhook-server-54787cf69c-lk2bh" Oct 07 13:18:39 crc kubenswrapper[4677]: I1007 13:18:39.021374 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/407281a9-edd0-4bc9-bb53-dee866971f52-webhook-cert\") pod \"metallb-operator-webhook-server-54787cf69c-lk2bh\" (UID: \"407281a9-edd0-4bc9-bb53-dee866971f52\") " pod="metallb-system/metallb-operator-webhook-server-54787cf69c-lk2bh" Oct 07 13:18:39 crc kubenswrapper[4677]: I1007 13:18:39.023872 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/407281a9-edd0-4bc9-bb53-dee866971f52-apiservice-cert\") pod \"metallb-operator-webhook-server-54787cf69c-lk2bh\" (UID: 
\"407281a9-edd0-4bc9-bb53-dee866971f52\") " pod="metallb-system/metallb-operator-webhook-server-54787cf69c-lk2bh" Oct 07 13:18:39 crc kubenswrapper[4677]: I1007 13:18:39.035904 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l52v7\" (UniqueName: \"kubernetes.io/projected/407281a9-edd0-4bc9-bb53-dee866971f52-kube-api-access-l52v7\") pod \"metallb-operator-webhook-server-54787cf69c-lk2bh\" (UID: \"407281a9-edd0-4bc9-bb53-dee866971f52\") " pod="metallb-system/metallb-operator-webhook-server-54787cf69c-lk2bh" Oct 07 13:18:39 crc kubenswrapper[4677]: I1007 13:18:39.087369 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-54787cf69c-lk2bh" Oct 07 13:18:39 crc kubenswrapper[4677]: I1007 13:18:39.142902 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-859694fc4f-rbbdh"] Oct 07 13:18:39 crc kubenswrapper[4677]: W1007 13:18:39.157406 4677 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3c7d4a9a_028b_4131_8a8e_722259f8cd2c.slice/crio-4d017f6d408b6a0830205fb082332756dd2b3d21122a21ac6656c05e7f2ec7b9 WatchSource:0}: Error finding container 4d017f6d408b6a0830205fb082332756dd2b3d21122a21ac6656c05e7f2ec7b9: Status 404 returned error can't find the container with id 4d017f6d408b6a0830205fb082332756dd2b3d21122a21ac6656c05e7f2ec7b9 Oct 07 13:18:39 crc kubenswrapper[4677]: I1007 13:18:39.374589 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-54787cf69c-lk2bh"] Oct 07 13:18:39 crc kubenswrapper[4677]: W1007 13:18:39.380037 4677 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod407281a9_edd0_4bc9_bb53_dee866971f52.slice/crio-6503ccdd2ee291291b078529f5413c7959705e2574486560dbc2c62703d85483 WatchSource:0}: Error finding container 6503ccdd2ee291291b078529f5413c7959705e2574486560dbc2c62703d85483: Status 404 returned error can't find the container with id 6503ccdd2ee291291b078529f5413c7959705e2574486560dbc2c62703d85483 Oct 07 13:18:39 crc kubenswrapper[4677]: I1007 13:18:39.471379 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-54787cf69c-lk2bh" event={"ID":"407281a9-edd0-4bc9-bb53-dee866971f52","Type":"ContainerStarted","Data":"6503ccdd2ee291291b078529f5413c7959705e2574486560dbc2c62703d85483"} Oct 07 13:18:39 crc kubenswrapper[4677]: I1007 13:18:39.472418 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-859694fc4f-rbbdh" event={"ID":"3c7d4a9a-028b-4131-8a8e-722259f8cd2c","Type":"ContainerStarted","Data":"4d017f6d408b6a0830205fb082332756dd2b3d21122a21ac6656c05e7f2ec7b9"} Oct 07 13:18:44 crc kubenswrapper[4677]: I1007 13:18:44.504576 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-859694fc4f-rbbdh" event={"ID":"3c7d4a9a-028b-4131-8a8e-722259f8cd2c","Type":"ContainerStarted","Data":"1f08da3957f417289e334d74ba0b17b5d3a365b9bfeb5d33396b3b552b4d0f0d"} Oct 07 13:18:44 crc kubenswrapper[4677]: I1007 13:18:44.505192 4677 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-859694fc4f-rbbdh" Oct 07 13:18:44 crc kubenswrapper[4677]: I1007 13:18:44.506092 4677 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-54787cf69c-lk2bh" event={"ID":"407281a9-edd0-4bc9-bb53-dee866971f52","Type":"ContainerStarted","Data":"e3de9926696abbce855b1725682d00f2628ed3f88ab273f91ad92c73c335191b"} Oct 07 13:18:44 crc kubenswrapper[4677]: I1007 13:18:44.506417 4677 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-54787cf69c-lk2bh" Oct 07 13:18:44 crc kubenswrapper[4677]: I1007 13:18:44.557605 4677 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-859694fc4f-rbbdh" podStartSLOduration=1.940217069 podStartE2EDuration="6.557577786s" podCreationTimestamp="2025-10-07 13:18:38 +0000 UTC" firstStartedPulling="2025-10-07 13:18:39.161548759 +0000 UTC m=+690.647257874" lastFinishedPulling="2025-10-07 13:18:43.778909466 +0000 UTC m=+695.264618591" observedRunningTime="2025-10-07 13:18:44.536665836 +0000 UTC m=+696.022374951" watchObservedRunningTime="2025-10-07 13:18:44.557577786 +0000 UTC m=+696.043286941" Oct 07 13:18:59 crc kubenswrapper[4677]: I1007 13:18:59.095730 4677 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-54787cf69c-lk2bh" Oct 07 13:18:59 crc kubenswrapper[4677]: I1007 13:18:59.116572 4677 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-54787cf69c-lk2bh" podStartSLOduration=16.708979157999998 podStartE2EDuration="21.116552296s" podCreationTimestamp="2025-10-07 13:18:38 +0000 UTC" firstStartedPulling="2025-10-07 13:18:39.383704733 +0000 UTC m=+690.869413848" lastFinishedPulling="2025-10-07 13:18:43.791277861 +0000 UTC m=+695.276986986" observedRunningTime="2025-10-07 13:18:44.558009488 +0000 UTC m=+696.043718603" watchObservedRunningTime="2025-10-07 13:18:59.116552296 +0000 UTC m=+710.602261431" Oct 07 13:19:18 crc kubenswrapper[4677]: I1007 13:19:18.864567 4677 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-859694fc4f-rbbdh" Oct 07 13:19:19 crc kubenswrapper[4677]: I1007 13:19:19.607872 4677 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-64bf5d555-g8px2"] Oct 07 13:19:19 crc kubenswrapper[4677]: I1007 13:19:19.608847 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-g8px2" Oct 07 13:19:19 crc kubenswrapper[4677]: I1007 13:19:19.611119 4677 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-s2cp4" Oct 07 13:19:19 crc kubenswrapper[4677]: I1007 13:19:19.611378 4677 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Oct 07 13:19:19 crc kubenswrapper[4677]: I1007 13:19:19.611532 4677 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-hq5q5"] Oct 07 13:19:19 crc kubenswrapper[4677]: I1007 13:19:19.614100 4677 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-hq5q5" Oct 07 13:19:19 crc kubenswrapper[4677]: I1007 13:19:19.615787 4677 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Oct 07 13:19:19 crc kubenswrapper[4677]: I1007 13:19:19.616389 4677 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Oct 07 13:19:19 crc kubenswrapper[4677]: I1007 13:19:19.641030 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-64bf5d555-g8px2"] Oct 07 13:19:19 crc kubenswrapper[4677]: I1007 13:19:19.651130 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/c6f0a46e-3591-4f18-9ff7-867b546b2273-reloader\") pod \"frr-k8s-hq5q5\" (UID: \"c6f0a46e-3591-4f18-9ff7-867b546b2273\") " pod="metallb-system/frr-k8s-hq5q5" Oct 07 13:19:19 crc kubenswrapper[4677]: I1007 13:19:19.651177 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/c6f0a46e-3591-4f18-9ff7-867b546b2273-metrics\") pod \"frr-k8s-hq5q5\" (UID: \"c6f0a46e-3591-4f18-9ff7-867b546b2273\") " pod="metallb-system/frr-k8s-hq5q5" Oct 07 13:19:19 crc kubenswrapper[4677]: I1007 13:19:19.651217 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/90595934-7ad8-4e4d-b918-2f3e63d63e34-cert\") pod \"frr-k8s-webhook-server-64bf5d555-g8px2\" (UID: \"90595934-7ad8-4e4d-b918-2f3e63d63e34\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-g8px2" Oct 07 13:19:19 crc kubenswrapper[4677]: I1007 13:19:19.651237 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/c6f0a46e-3591-4f18-9ff7-867b546b2273-frr-conf\") pod \"frr-k8s-hq5q5\" (UID: \"c6f0a46e-3591-4f18-9ff7-867b546b2273\") " pod="metallb-system/frr-k8s-hq5q5" Oct 07 13:19:19 crc kubenswrapper[4677]: I1007 13:19:19.651252 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c6f0a46e-3591-4f18-9ff7-867b546b2273-metrics-certs\") pod \"frr-k8s-hq5q5\" (UID: \"c6f0a46e-3591-4f18-9ff7-867b546b2273\") " pod="metallb-system/frr-k8s-hq5q5" Oct 07 13:19:19 crc kubenswrapper[4677]: I1007 13:19:19.651272 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l69bb\" (UniqueName: \"kubernetes.io/projected/c6f0a46e-3591-4f18-9ff7-867b546b2273-kube-api-access-l69bb\") pod \"frr-k8s-hq5q5\" (UID: \"c6f0a46e-3591-4f18-9ff7-867b546b2273\") " pod="metallb-system/frr-k8s-hq5q5" Oct 07 13:19:19 crc kubenswrapper[4677]: I1007 13:19:19.651292 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z2pks\" (UniqueName: \"kubernetes.io/projected/90595934-7ad8-4e4d-b918-2f3e63d63e34-kube-api-access-z2pks\") pod \"frr-k8s-webhook-server-64bf5d555-g8px2\" (UID: \"90595934-7ad8-4e4d-b918-2f3e63d63e34\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-g8px2" Oct 07 13:19:19 crc kubenswrapper[4677]: I1007 13:19:19.651310 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: 
\"kubernetes.io/configmap/c6f0a46e-3591-4f18-9ff7-867b546b2273-frr-startup\") pod \"frr-k8s-hq5q5\" (UID: \"c6f0a46e-3591-4f18-9ff7-867b546b2273\") " pod="metallb-system/frr-k8s-hq5q5" Oct 07 13:19:19 crc kubenswrapper[4677]: I1007 13:19:19.651326 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/c6f0a46e-3591-4f18-9ff7-867b546b2273-frr-sockets\") pod \"frr-k8s-hq5q5\" (UID: \"c6f0a46e-3591-4f18-9ff7-867b546b2273\") " pod="metallb-system/frr-k8s-hq5q5" Oct 07 13:19:19 crc kubenswrapper[4677]: I1007 13:19:19.710331 4677 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-z6n69"] Oct 07 13:19:19 crc kubenswrapper[4677]: I1007 13:19:19.711134 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-z6n69" Oct 07 13:19:19 crc kubenswrapper[4677]: I1007 13:19:19.715805 4677 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Oct 07 13:19:19 crc kubenswrapper[4677]: I1007 13:19:19.716116 4677 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Oct 07 13:19:19 crc kubenswrapper[4677]: I1007 13:19:19.716199 4677 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Oct 07 13:19:19 crc kubenswrapper[4677]: I1007 13:19:19.716870 4677 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-92865" Oct 07 13:19:19 crc kubenswrapper[4677]: I1007 13:19:19.728209 4677 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-68d546b9d8-kmjvc"] Oct 07 13:19:19 crc kubenswrapper[4677]: I1007 13:19:19.729333 4677 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-68d546b9d8-kmjvc" Oct 07 13:19:19 crc kubenswrapper[4677]: I1007 13:19:19.731688 4677 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Oct 07 13:19:19 crc kubenswrapper[4677]: I1007 13:19:19.752264 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/c6f0a46e-3591-4f18-9ff7-867b546b2273-reloader\") pod \"frr-k8s-hq5q5\" (UID: \"c6f0a46e-3591-4f18-9ff7-867b546b2273\") " pod="metallb-system/frr-k8s-hq5q5" Oct 07 13:19:19 crc kubenswrapper[4677]: I1007 13:19:19.752310 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/c6f0a46e-3591-4f18-9ff7-867b546b2273-metrics\") pod \"frr-k8s-hq5q5\" (UID: \"c6f0a46e-3591-4f18-9ff7-867b546b2273\") " pod="metallb-system/frr-k8s-hq5q5" Oct 07 13:19:19 crc kubenswrapper[4677]: I1007 13:19:19.752348 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mrmql\" (UniqueName: \"kubernetes.io/projected/6e40d031-a727-42d1-af91-4f20c3c10fef-kube-api-access-mrmql\") pod \"speaker-z6n69\" (UID: \"6e40d031-a727-42d1-af91-4f20c3c10fef\") " pod="metallb-system/speaker-z6n69" Oct 07 13:19:19 crc kubenswrapper[4677]: I1007 13:19:19.752390 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/90595934-7ad8-4e4d-b918-2f3e63d63e34-cert\") pod \"frr-k8s-webhook-server-64bf5d555-g8px2\" (UID: \"90595934-7ad8-4e4d-b918-2f3e63d63e34\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-g8px2" Oct 07 13:19:19 crc kubenswrapper[4677]: I1007 13:19:19.752419 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/c6f0a46e-3591-4f18-9ff7-867b546b2273-frr-conf\") pod \"frr-k8s-hq5q5\" (UID: \"c6f0a46e-3591-4f18-9ff7-867b546b2273\") " pod="metallb-system/frr-k8s-hq5q5" Oct 07 13:19:19 crc kubenswrapper[4677]: I1007 13:19:19.752458 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c6f0a46e-3591-4f18-9ff7-867b546b2273-metrics-certs\") pod \"frr-k8s-hq5q5\" (UID: \"c6f0a46e-3591-4f18-9ff7-867b546b2273\") " pod="metallb-system/frr-k8s-hq5q5" Oct 07 13:19:19 crc kubenswrapper[4677]: I1007 13:19:19.752490 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l69bb\" (UniqueName: \"kubernetes.io/projected/c6f0a46e-3591-4f18-9ff7-867b546b2273-kube-api-access-l69bb\") pod \"frr-k8s-hq5q5\" (UID: \"c6f0a46e-3591-4f18-9ff7-867b546b2273\") " pod="metallb-system/frr-k8s-hq5q5" Oct 07 13:19:19 crc kubenswrapper[4677]: I1007 13:19:19.752515 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z2pks\" (UniqueName: \"kubernetes.io/projected/90595934-7ad8-4e4d-b918-2f3e63d63e34-kube-api-access-z2pks\") pod \"frr-k8s-webhook-server-64bf5d555-g8px2\" (UID: \"90595934-7ad8-4e4d-b918-2f3e63d63e34\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-g8px2" Oct 07 13:19:19 crc kubenswrapper[4677]: I1007 13:19:19.752537 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/c6f0a46e-3591-4f18-9ff7-867b546b2273-frr-startup\") pod \"frr-k8s-hq5q5\" (UID: 
\"c6f0a46e-3591-4f18-9ff7-867b546b2273\") " pod="metallb-system/frr-k8s-hq5q5" Oct 07 13:19:19 crc kubenswrapper[4677]: I1007 13:19:19.752567 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/c6f0a46e-3591-4f18-9ff7-867b546b2273-frr-sockets\") pod \"frr-k8s-hq5q5\" (UID: \"c6f0a46e-3591-4f18-9ff7-867b546b2273\") " pod="metallb-system/frr-k8s-hq5q5" Oct 07 13:19:19 crc kubenswrapper[4677]: I1007 13:19:19.752598 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6e40d031-a727-42d1-af91-4f20c3c10fef-metrics-certs\") pod \"speaker-z6n69\" (UID: \"6e40d031-a727-42d1-af91-4f20c3c10fef\") " pod="metallb-system/speaker-z6n69" Oct 07 13:19:19 crc kubenswrapper[4677]: I1007 13:19:19.752617 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/6e40d031-a727-42d1-af91-4f20c3c10fef-memberlist\") pod \"speaker-z6n69\" (UID: \"6e40d031-a727-42d1-af91-4f20c3c10fef\") " pod="metallb-system/speaker-z6n69" Oct 07 13:19:19 crc kubenswrapper[4677]: I1007 13:19:19.752637 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/6e40d031-a727-42d1-af91-4f20c3c10fef-metallb-excludel2\") pod \"speaker-z6n69\" (UID: \"6e40d031-a727-42d1-af91-4f20c3c10fef\") " pod="metallb-system/speaker-z6n69" Oct 07 13:19:19 crc kubenswrapper[4677]: I1007 13:19:19.753038 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/c6f0a46e-3591-4f18-9ff7-867b546b2273-reloader\") pod \"frr-k8s-hq5q5\" (UID: \"c6f0a46e-3591-4f18-9ff7-867b546b2273\") " pod="metallb-system/frr-k8s-hq5q5" Oct 07 13:19:19 crc kubenswrapper[4677]: I1007 13:19:19.753244 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-68d546b9d8-kmjvc"] Oct 07 13:19:19 crc kubenswrapper[4677]: I1007 13:19:19.753248 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/c6f0a46e-3591-4f18-9ff7-867b546b2273-metrics\") pod \"frr-k8s-hq5q5\" (UID: \"c6f0a46e-3591-4f18-9ff7-867b546b2273\") " pod="metallb-system/frr-k8s-hq5q5" Oct 07 13:19:19 crc kubenswrapper[4677]: I1007 13:19:19.754016 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/c6f0a46e-3591-4f18-9ff7-867b546b2273-frr-conf\") pod \"frr-k8s-hq5q5\" (UID: \"c6f0a46e-3591-4f18-9ff7-867b546b2273\") " pod="metallb-system/frr-k8s-hq5q5" Oct 07 13:19:19 crc kubenswrapper[4677]: E1007 13:19:19.754196 4677 secret.go:188] Couldn't get secret metallb-system/frr-k8s-certs-secret: secret "frr-k8s-certs-secret" not found Oct 07 13:19:19 crc kubenswrapper[4677]: E1007 13:19:19.754324 4677 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c6f0a46e-3591-4f18-9ff7-867b546b2273-metrics-certs podName:c6f0a46e-3591-4f18-9ff7-867b546b2273 nodeName:}" failed. No retries permitted until 2025-10-07 13:19:20.254305781 +0000 UTC m=+731.740014996 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c6f0a46e-3591-4f18-9ff7-867b546b2273-metrics-certs") pod "frr-k8s-hq5q5" (UID: "c6f0a46e-3591-4f18-9ff7-867b546b2273") : secret "frr-k8s-certs-secret" not found Oct 07 13:19:19 crc kubenswrapper[4677]: I1007 13:19:19.754873 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/c6f0a46e-3591-4f18-9ff7-867b546b2273-frr-sockets\") pod \"frr-k8s-hq5q5\" (UID: \"c6f0a46e-3591-4f18-9ff7-867b546b2273\") " pod="metallb-system/frr-k8s-hq5q5" Oct 07 13:19:19 crc kubenswrapper[4677]: I1007 13:19:19.756268 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/c6f0a46e-3591-4f18-9ff7-867b546b2273-frr-startup\") pod \"frr-k8s-hq5q5\" (UID: \"c6f0a46e-3591-4f18-9ff7-867b546b2273\") " pod="metallb-system/frr-k8s-hq5q5" Oct 07 13:19:19 crc kubenswrapper[4677]: I1007 13:19:19.772170 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/90595934-7ad8-4e4d-b918-2f3e63d63e34-cert\") pod \"frr-k8s-webhook-server-64bf5d555-g8px2\" (UID: \"90595934-7ad8-4e4d-b918-2f3e63d63e34\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-g8px2" Oct 07 13:19:19 crc kubenswrapper[4677]: I1007 13:19:19.775990 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z2pks\" (UniqueName: \"kubernetes.io/projected/90595934-7ad8-4e4d-b918-2f3e63d63e34-kube-api-access-z2pks\") pod \"frr-k8s-webhook-server-64bf5d555-g8px2\" (UID: \"90595934-7ad8-4e4d-b918-2f3e63d63e34\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-g8px2" Oct 07 13:19:19 crc kubenswrapper[4677]: I1007 13:19:19.779914 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l69bb\" (UniqueName: \"kubernetes.io/projected/c6f0a46e-3591-4f18-9ff7-867b546b2273-kube-api-access-l69bb\") pod \"frr-k8s-hq5q5\" (UID: \"c6f0a46e-3591-4f18-9ff7-867b546b2273\") " pod="metallb-system/frr-k8s-hq5q5" Oct 07 13:19:19 crc kubenswrapper[4677]: I1007 13:19:19.853994 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6e40d031-a727-42d1-af91-4f20c3c10fef-metrics-certs\") pod \"speaker-z6n69\" (UID: \"6e40d031-a727-42d1-af91-4f20c3c10fef\") " pod="metallb-system/speaker-z6n69" Oct 07 13:19:19 crc kubenswrapper[4677]: I1007 13:19:19.854292 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/6e40d031-a727-42d1-af91-4f20c3c10fef-memberlist\") pod \"speaker-z6n69\" (UID: \"6e40d031-a727-42d1-af91-4f20c3c10fef\") " pod="metallb-system/speaker-z6n69" Oct 07 13:19:19 crc kubenswrapper[4677]: I1007 13:19:19.854382 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/6e40d031-a727-42d1-af91-4f20c3c10fef-metallb-excludel2\") pod \"speaker-z6n69\" (UID: \"6e40d031-a727-42d1-af91-4f20c3c10fef\") " pod="metallb-system/speaker-z6n69" Oct 07 13:19:19 crc kubenswrapper[4677]: E1007 13:19:19.854169 4677 secret.go:188] Couldn't get secret metallb-system/speaker-certs-secret: secret "speaker-certs-secret" not found Oct 07 13:19:19 crc kubenswrapper[4677]: E1007 13:19:19.854510 4677 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/6e40d031-a727-42d1-af91-4f20c3c10fef-metrics-certs podName:6e40d031-a727-42d1-af91-4f20c3c10fef nodeName:}" failed. No retries permitted until 2025-10-07 13:19:20.354494591 +0000 UTC m=+731.840203706 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6e40d031-a727-42d1-af91-4f20c3c10fef-metrics-certs") pod "speaker-z6n69" (UID: "6e40d031-a727-42d1-af91-4f20c3c10fef") : secret "speaker-certs-secret" not found Oct 07 13:19:19 crc kubenswrapper[4677]: I1007 13:19:19.854570 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4p6vc\" (UniqueName: \"kubernetes.io/projected/fc03bc45-d39d-4302-839a-1f89960e640f-kube-api-access-4p6vc\") pod \"controller-68d546b9d8-kmjvc\" (UID: \"fc03bc45-d39d-4302-839a-1f89960e640f\") " pod="metallb-system/controller-68d546b9d8-kmjvc" Oct 07 13:19:19 crc kubenswrapper[4677]: I1007 13:19:19.854657 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fc03bc45-d39d-4302-839a-1f89960e640f-cert\") pod \"controller-68d546b9d8-kmjvc\" (UID: \"fc03bc45-d39d-4302-839a-1f89960e640f\") " pod="metallb-system/controller-68d546b9d8-kmjvc" Oct 07 13:19:19 crc kubenswrapper[4677]: I1007 13:19:19.854771 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fc03bc45-d39d-4302-839a-1f89960e640f-metrics-certs\") pod \"controller-68d546b9d8-kmjvc\" (UID: \"fc03bc45-d39d-4302-839a-1f89960e640f\") " pod="metallb-system/controller-68d546b9d8-kmjvc" Oct 07 13:19:19 crc kubenswrapper[4677]: I1007 13:19:19.854874 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mrmql\" (UniqueName: \"kubernetes.io/projected/6e40d031-a727-42d1-af91-4f20c3c10fef-kube-api-access-mrmql\") pod \"speaker-z6n69\" (UID: \"6e40d031-a727-42d1-af91-4f20c3c10fef\") " pod="metallb-system/speaker-z6n69" Oct 07 13:19:19 crc kubenswrapper[4677]: E1007 13:19:19.854338 4677 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Oct 07 13:19:19 crc kubenswrapper[4677]: I1007 13:19:19.855323 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/6e40d031-a727-42d1-af91-4f20c3c10fef-metallb-excludel2\") pod \"speaker-z6n69\" (UID: \"6e40d031-a727-42d1-af91-4f20c3c10fef\") " pod="metallb-system/speaker-z6n69" Oct 07 13:19:19 crc kubenswrapper[4677]: E1007 13:19:19.855340 4677 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6e40d031-a727-42d1-af91-4f20c3c10fef-memberlist podName:6e40d031-a727-42d1-af91-4f20c3c10fef nodeName:}" failed. No retries permitted until 2025-10-07 13:19:20.355323515 +0000 UTC m=+731.841032630 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/6e40d031-a727-42d1-af91-4f20c3c10fef-memberlist") pod "speaker-z6n69" (UID: "6e40d031-a727-42d1-af91-4f20c3c10fef") : secret "metallb-memberlist" not found Oct 07 13:19:19 crc kubenswrapper[4677]: I1007 13:19:19.875173 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mrmql\" (UniqueName: \"kubernetes.io/projected/6e40d031-a727-42d1-af91-4f20c3c10fef-kube-api-access-mrmql\") pod \"speaker-z6n69\" (UID: \"6e40d031-a727-42d1-af91-4f20c3c10fef\") " pod="metallb-system/speaker-z6n69" Oct 07 13:19:19 crc kubenswrapper[4677]: I1007 13:19:19.924667 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-g8px2" Oct 07 13:19:19 crc kubenswrapper[4677]: I1007 13:19:19.956398 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4p6vc\" (UniqueName: \"kubernetes.io/projected/fc03bc45-d39d-4302-839a-1f89960e640f-kube-api-access-4p6vc\") pod \"controller-68d546b9d8-kmjvc\" (UID: \"fc03bc45-d39d-4302-839a-1f89960e640f\") " pod="metallb-system/controller-68d546b9d8-kmjvc" Oct 07 13:19:19 crc kubenswrapper[4677]: I1007 13:19:19.956914 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fc03bc45-d39d-4302-839a-1f89960e640f-cert\") pod \"controller-68d546b9d8-kmjvc\" (UID: \"fc03bc45-d39d-4302-839a-1f89960e640f\") " pod="metallb-system/controller-68d546b9d8-kmjvc" Oct 07 13:19:19 crc kubenswrapper[4677]: I1007 13:19:19.957021 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fc03bc45-d39d-4302-839a-1f89960e640f-metrics-certs\") pod \"controller-68d546b9d8-kmjvc\" (UID: \"fc03bc45-d39d-4302-839a-1f89960e640f\") " pod="metallb-system/controller-68d546b9d8-kmjvc" Oct 07 13:19:19 crc kubenswrapper[4677]: I1007 13:19:19.959744 4677 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Oct 07 13:19:19 crc kubenswrapper[4677]: I1007 13:19:19.961294 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fc03bc45-d39d-4302-839a-1f89960e640f-metrics-certs\") pod \"controller-68d546b9d8-kmjvc\" (UID: \"fc03bc45-d39d-4302-839a-1f89960e640f\") " pod="metallb-system/controller-68d546b9d8-kmjvc" Oct 07 13:19:19 crc kubenswrapper[4677]: I1007 13:19:19.971079 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fc03bc45-d39d-4302-839a-1f89960e640f-cert\") pod \"controller-68d546b9d8-kmjvc\" (UID: \"fc03bc45-d39d-4302-839a-1f89960e640f\") " pod="metallb-system/controller-68d546b9d8-kmjvc" Oct 07 13:19:19 crc kubenswrapper[4677]: I1007 13:19:19.972390 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4p6vc\" (UniqueName: \"kubernetes.io/projected/fc03bc45-d39d-4302-839a-1f89960e640f-kube-api-access-4p6vc\") pod \"controller-68d546b9d8-kmjvc\" (UID: \"fc03bc45-d39d-4302-839a-1f89960e640f\") " pod="metallb-system/controller-68d546b9d8-kmjvc" Oct 07 13:19:20 crc kubenswrapper[4677]: I1007 13:19:20.047011 4677 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-68d546b9d8-kmjvc" Oct 07 13:19:20 crc kubenswrapper[4677]: I1007 13:19:20.130065 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-64bf5d555-g8px2"] Oct 07 13:19:20 crc kubenswrapper[4677]: I1007 13:19:20.260402 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c6f0a46e-3591-4f18-9ff7-867b546b2273-metrics-certs\") pod \"frr-k8s-hq5q5\" (UID: \"c6f0a46e-3591-4f18-9ff7-867b546b2273\") " pod="metallb-system/frr-k8s-hq5q5" Oct 07 13:19:20 crc kubenswrapper[4677]: I1007 13:19:20.268011 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c6f0a46e-3591-4f18-9ff7-867b546b2273-metrics-certs\") pod \"frr-k8s-hq5q5\" (UID: \"c6f0a46e-3591-4f18-9ff7-867b546b2273\") " pod="metallb-system/frr-k8s-hq5q5" Oct 07 13:19:20 crc kubenswrapper[4677]: I1007 13:19:20.361332 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6e40d031-a727-42d1-af91-4f20c3c10fef-metrics-certs\") pod \"speaker-z6n69\" (UID: \"6e40d031-a727-42d1-af91-4f20c3c10fef\") " pod="metallb-system/speaker-z6n69" Oct 07 13:19:20 crc kubenswrapper[4677]: I1007 13:19:20.361385 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/6e40d031-a727-42d1-af91-4f20c3c10fef-memberlist\") pod \"speaker-z6n69\" (UID: \"6e40d031-a727-42d1-af91-4f20c3c10fef\") " pod="metallb-system/speaker-z6n69" Oct 07 13:19:20 crc kubenswrapper[4677]: E1007 13:19:20.361604 4677 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Oct 07 13:19:20 crc kubenswrapper[4677]: E1007 13:19:20.361667 4677 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6e40d031-a727-42d1-af91-4f20c3c10fef-memberlist podName:6e40d031-a727-42d1-af91-4f20c3c10fef nodeName:}" failed. No retries permitted until 2025-10-07 13:19:21.361650958 +0000 UTC m=+732.847360083 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/6e40d031-a727-42d1-af91-4f20c3c10fef-memberlist") pod "speaker-z6n69" (UID: "6e40d031-a727-42d1-af91-4f20c3c10fef") : secret "metallb-memberlist" not found Oct 07 13:19:20 crc kubenswrapper[4677]: I1007 13:19:20.368196 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6e40d031-a727-42d1-af91-4f20c3c10fef-metrics-certs\") pod \"speaker-z6n69\" (UID: \"6e40d031-a727-42d1-af91-4f20c3c10fef\") " pod="metallb-system/speaker-z6n69" Oct 07 13:19:20 crc kubenswrapper[4677]: I1007 13:19:20.475626 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-68d546b9d8-kmjvc"] Oct 07 13:19:20 crc kubenswrapper[4677]: I1007 13:19:20.532383 4677 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-hq5q5" Oct 07 13:19:20 crc kubenswrapper[4677]: I1007 13:19:20.735756 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-68d546b9d8-kmjvc" event={"ID":"fc03bc45-d39d-4302-839a-1f89960e640f","Type":"ContainerStarted","Data":"87021d0379704655315d0ccb9fec83548ad0eae992e597bfb13d312fe5ffe589"} Oct 07 13:19:20 crc kubenswrapper[4677]: I1007 13:19:20.735792 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-68d546b9d8-kmjvc" event={"ID":"fc03bc45-d39d-4302-839a-1f89960e640f","Type":"ContainerStarted","Data":"f66544331f6dbae3e59de3f1d40a41d64cb1846bd716401afe1dac5f6337e9c2"} Oct 07 13:19:20 crc kubenswrapper[4677]: I1007 13:19:20.736907 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-g8px2" event={"ID":"90595934-7ad8-4e4d-b918-2f3e63d63e34","Type":"ContainerStarted","Data":"7788a5b6d717a792bc944439ec89a799a8589cd328a3b038598465bd3775fb11"} Oct 07 13:19:20 crc kubenswrapper[4677]: I1007 13:19:20.737925 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-hq5q5" event={"ID":"c6f0a46e-3591-4f18-9ff7-867b546b2273","Type":"ContainerStarted","Data":"701d1d5c20b9df92fc1b3ebc506f40b51701f476e652f0abb2063a1c1536b076"} Oct 07 13:19:21 crc kubenswrapper[4677]: I1007 13:19:21.375745 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/6e40d031-a727-42d1-af91-4f20c3c10fef-memberlist\") pod \"speaker-z6n69\" (UID: \"6e40d031-a727-42d1-af91-4f20c3c10fef\") " pod="metallb-system/speaker-z6n69" Oct 07 13:19:21 crc kubenswrapper[4677]: I1007 13:19:21.385256 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/6e40d031-a727-42d1-af91-4f20c3c10fef-memberlist\") pod \"speaker-z6n69\" (UID: \"6e40d031-a727-42d1-af91-4f20c3c10fef\") " pod="metallb-system/speaker-z6n69" Oct 07 13:19:21 crc kubenswrapper[4677]: I1007 13:19:21.524365 4677 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-z6n69" Oct 07 13:19:21 crc kubenswrapper[4677]: W1007 13:19:21.595607 4677 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6e40d031_a727_42d1_af91_4f20c3c10fef.slice/crio-9f48c2edab9e7687156aaf6237f6b5559bfe04f5bd8c70684f1eac5e67240258 WatchSource:0}: Error finding container 9f48c2edab9e7687156aaf6237f6b5559bfe04f5bd8c70684f1eac5e67240258: Status 404 returned error can't find the container with id 9f48c2edab9e7687156aaf6237f6b5559bfe04f5bd8c70684f1eac5e67240258 Oct 07 13:19:21 crc kubenswrapper[4677]: I1007 13:19:21.766872 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-z6n69" event={"ID":"6e40d031-a727-42d1-af91-4f20c3c10fef","Type":"ContainerStarted","Data":"9f48c2edab9e7687156aaf6237f6b5559bfe04f5bd8c70684f1eac5e67240258"} Oct 07 13:19:22 crc kubenswrapper[4677]: I1007 13:19:22.772351 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-z6n69" event={"ID":"6e40d031-a727-42d1-af91-4f20c3c10fef","Type":"ContainerStarted","Data":"05d65fcd3359c1ecbcecdadffba1233b08424e18c291fe8888788d18a6a143b9"} Oct 07 13:19:24 crc kubenswrapper[4677]: I1007 13:19:24.785739 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-68d546b9d8-kmjvc" event={"ID":"fc03bc45-d39d-4302-839a-1f89960e640f","Type":"ContainerStarted","Data":"e7c128e7d33c8a9a069c3636e246caa6fc25aa204b0062798be900dbe6687c11"} Oct 07 13:19:24 crc kubenswrapper[4677]: I1007 13:19:24.786115 4677 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-68d546b9d8-kmjvc" Oct 07 13:19:24 crc kubenswrapper[4677]: I1007 13:19:24.787672 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-z6n69" event={"ID":"6e40d031-a727-42d1-af91-4f20c3c10fef","Type":"ContainerStarted","Data":"b9ad8dbc0fed3272253b7f8db991469d40e5d55e36111d395c2eda112686231e"} Oct 07 13:19:24 crc kubenswrapper[4677]: I1007 13:19:24.787782 4677 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-z6n69" Oct 07 13:19:24 crc kubenswrapper[4677]: I1007 13:19:24.805867 4677 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-68d546b9d8-kmjvc" podStartSLOduration=2.504737275 podStartE2EDuration="5.805832648s" podCreationTimestamp="2025-10-07 13:19:19 +0000 UTC" firstStartedPulling="2025-10-07 13:19:20.638854486 +0000 UTC m=+732.124563601" lastFinishedPulling="2025-10-07 13:19:23.939949859 +0000 UTC m=+735.425658974" observedRunningTime="2025-10-07 13:19:24.799272229 +0000 UTC m=+736.284981344" watchObservedRunningTime="2025-10-07 13:19:24.805832648 +0000 UTC m=+736.291541763" Oct 07 13:19:24 crc kubenswrapper[4677]: I1007 13:19:24.814371 4677 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-z6n69" podStartSLOduration=3.783968753 podStartE2EDuration="5.814346772s" podCreationTimestamp="2025-10-07 13:19:19 +0000 UTC" firstStartedPulling="2025-10-07 13:19:21.908024626 +0000 UTC m=+733.393733741" lastFinishedPulling="2025-10-07 13:19:23.938402655 +0000 UTC m=+735.424111760" observedRunningTime="2025-10-07 13:19:24.811687456 +0000 UTC m=+736.297396571" watchObservedRunningTime="2025-10-07 13:19:24.814346772 +0000 UTC m=+736.300055887" Oct 07 13:19:27 crc kubenswrapper[4677]: I1007 13:19:27.817663 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="metallb-system/frr-k8s-webhook-server-64bf5d555-g8px2" event={"ID":"90595934-7ad8-4e4d-b918-2f3e63d63e34","Type":"ContainerStarted","Data":"4498c639011b68adaced6ed6b0b367d52111fb587815aceb2fe74a64e1846fbf"} Oct 07 13:19:27 crc kubenswrapper[4677]: I1007 13:19:27.817773 4677 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-g8px2" Oct 07 13:19:27 crc kubenswrapper[4677]: I1007 13:19:27.820704 4677 generic.go:334] "Generic (PLEG): container finished" podID="c6f0a46e-3591-4f18-9ff7-867b546b2273" containerID="03028acb54a20ea2c90479e00b532a7662c21a4177f1bd6426955c0c987d0162" exitCode=0 Oct 07 13:19:27 crc kubenswrapper[4677]: I1007 13:19:27.820747 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-hq5q5" event={"ID":"c6f0a46e-3591-4f18-9ff7-867b546b2273","Type":"ContainerDied","Data":"03028acb54a20ea2c90479e00b532a7662c21a4177f1bd6426955c0c987d0162"} Oct 07 13:19:27 crc kubenswrapper[4677]: I1007 13:19:27.840355 4677 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-g8px2" podStartSLOduration=1.570759819 podStartE2EDuration="8.840332158s" podCreationTimestamp="2025-10-07 13:19:19 +0000 UTC" firstStartedPulling="2025-10-07 13:19:20.146022181 +0000 UTC m=+731.631731306" lastFinishedPulling="2025-10-07 13:19:27.41559452 +0000 UTC m=+738.901303645" observedRunningTime="2025-10-07 13:19:27.833923954 +0000 UTC m=+739.319633099" watchObservedRunningTime="2025-10-07 13:19:27.840332158 +0000 UTC m=+739.326041283" Oct 07 13:19:28 crc kubenswrapper[4677]: I1007 13:19:28.831348 4677 generic.go:334] "Generic (PLEG): container finished" podID="c6f0a46e-3591-4f18-9ff7-867b546b2273" containerID="4368c7a60a4e9e20f78ca96923d6b54c345650a89cd3a78b574806f924d9c196" exitCode=0 Oct 07 13:19:28 crc kubenswrapper[4677]: I1007 13:19:28.832621 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-hq5q5" event={"ID":"c6f0a46e-3591-4f18-9ff7-867b546b2273","Type":"ContainerDied","Data":"4368c7a60a4e9e20f78ca96923d6b54c345650a89cd3a78b574806f924d9c196"} Oct 07 13:19:29 crc kubenswrapper[4677]: I1007 13:19:29.840778 4677 generic.go:334] "Generic (PLEG): container finished" podID="c6f0a46e-3591-4f18-9ff7-867b546b2273" containerID="93c8b4967d0b5e4153388b6be0dfe92b8047980e3e3892425ff29637075cea17" exitCode=0 Oct 07 13:19:29 crc kubenswrapper[4677]: I1007 13:19:29.840891 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-hq5q5" event={"ID":"c6f0a46e-3591-4f18-9ff7-867b546b2273","Type":"ContainerDied","Data":"93c8b4967d0b5e4153388b6be0dfe92b8047980e3e3892425ff29637075cea17"} Oct 07 13:19:30 crc kubenswrapper[4677]: I1007 13:19:30.051074 4677 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-68d546b9d8-kmjvc" Oct 07 13:19:30 crc kubenswrapper[4677]: I1007 13:19:30.853717 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-hq5q5" event={"ID":"c6f0a46e-3591-4f18-9ff7-867b546b2273","Type":"ContainerStarted","Data":"50a72e1987469d2883462d9c1bc74ad90d2b9583ad596d3aaeb76eaec39ad29e"} Oct 07 13:19:30 crc kubenswrapper[4677]: I1007 13:19:30.853986 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-hq5q5" event={"ID":"c6f0a46e-3591-4f18-9ff7-867b546b2273","Type":"ContainerStarted","Data":"de3eebb6e48645b3facd113b80e12960e075eb09c433b06aa10dc52bed127802"} Oct 07 13:19:30 crc kubenswrapper[4677]: 
I1007 13:19:30.854001 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-hq5q5" event={"ID":"c6f0a46e-3591-4f18-9ff7-867b546b2273","Type":"ContainerStarted","Data":"d83de25670d225f70efe52b489a857b36a2ca37328c1f3a74e7489432417acb8"} Oct 07 13:19:30 crc kubenswrapper[4677]: I1007 13:19:30.854012 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-hq5q5" event={"ID":"c6f0a46e-3591-4f18-9ff7-867b546b2273","Type":"ContainerStarted","Data":"bb167b88b1e2b7733271d7dee5233238d43f0ef4c38d34953308e5720069e682"} Oct 07 13:19:30 crc kubenswrapper[4677]: I1007 13:19:30.854023 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-hq5q5" event={"ID":"c6f0a46e-3591-4f18-9ff7-867b546b2273","Type":"ContainerStarted","Data":"bde4611017965c2ec9572e3ccc8281e0018fa2ea599e7d40f8ea3d21bd3b41e4"} Oct 07 13:19:31 crc kubenswrapper[4677]: I1007 13:19:31.533113 4677 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-z6n69" Oct 07 13:19:31 crc kubenswrapper[4677]: I1007 13:19:31.805342 4677 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-index-n74hz"] Oct 07 13:19:31 crc kubenswrapper[4677]: I1007 13:19:31.806144 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-index-n74hz" Oct 07 13:19:31 crc kubenswrapper[4677]: I1007 13:19:31.813204 4677 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-index-dockercfg-76qrb" Oct 07 13:19:31 crc kubenswrapper[4677]: I1007 13:19:31.835407 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-index-n74hz"] Oct 07 13:19:31 crc kubenswrapper[4677]: I1007 13:19:31.877680 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-hq5q5" event={"ID":"c6f0a46e-3591-4f18-9ff7-867b546b2273","Type":"ContainerStarted","Data":"80b3a14b92b193fa23da5fea71ffa34585b348e1a10562028e68a5dfc5275cdf"} Oct 07 13:19:31 crc kubenswrapper[4677]: I1007 13:19:31.878154 4677 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-hq5q5" Oct 07 13:19:31 crc kubenswrapper[4677]: I1007 13:19:31.908849 4677 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-hq5q5" podStartSLOduration=6.160867762 podStartE2EDuration="12.908832879s" podCreationTimestamp="2025-10-07 13:19:19 +0000 UTC" firstStartedPulling="2025-10-07 13:19:20.65012752 +0000 UTC m=+732.135836645" lastFinishedPulling="2025-10-07 13:19:27.398092637 +0000 UTC m=+738.883801762" observedRunningTime="2025-10-07 13:19:31.903487245 +0000 UTC m=+743.389196360" watchObservedRunningTime="2025-10-07 13:19:31.908832879 +0000 UTC m=+743.394541994" Oct 07 13:19:31 crc kubenswrapper[4677]: I1007 13:19:31.915221 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4d29h\" (UniqueName: \"kubernetes.io/projected/af6b9f6e-0edf-4f47-824e-15ad1a549087-kube-api-access-4d29h\") pod \"infra-operator-index-n74hz\" (UID: \"af6b9f6e-0edf-4f47-824e-15ad1a549087\") " pod="openstack-operators/infra-operator-index-n74hz" Oct 07 13:19:32 crc kubenswrapper[4677]: I1007 13:19:32.016268 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4d29h\" (UniqueName: \"kubernetes.io/projected/af6b9f6e-0edf-4f47-824e-15ad1a549087-kube-api-access-4d29h\") 
pod \"infra-operator-index-n74hz\" (UID: \"af6b9f6e-0edf-4f47-824e-15ad1a549087\") " pod="openstack-operators/infra-operator-index-n74hz" Oct 07 13:19:32 crc kubenswrapper[4677]: I1007 13:19:32.043589 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4d29h\" (UniqueName: \"kubernetes.io/projected/af6b9f6e-0edf-4f47-824e-15ad1a549087-kube-api-access-4d29h\") pod \"infra-operator-index-n74hz\" (UID: \"af6b9f6e-0edf-4f47-824e-15ad1a549087\") " pod="openstack-operators/infra-operator-index-n74hz" Oct 07 13:19:32 crc kubenswrapper[4677]: I1007 13:19:32.125463 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-index-n74hz" Oct 07 13:19:32 crc kubenswrapper[4677]: I1007 13:19:32.382735 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-index-n74hz"] Oct 07 13:19:32 crc kubenswrapper[4677]: I1007 13:19:32.887209 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-index-n74hz" event={"ID":"af6b9f6e-0edf-4f47-824e-15ad1a549087","Type":"ContainerStarted","Data":"f76499805c7cb7b9b504bcde65328fbde3079e06557edd62385eff12933103c7"} Oct 07 13:19:33 crc kubenswrapper[4677]: I1007 13:19:33.609627 4677 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-qk4fp"] Oct 07 13:19:33 crc kubenswrapper[4677]: I1007 13:19:33.610079 4677 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-qk4fp" podUID="807933d0-9a58-4191-9bde-74a00551f72e" containerName="controller-manager" containerID="cri-o://dfd09b2306f81fe19e582e11c9c4b3ce378d6088e025db2fee88df38a1cc3eca" gracePeriod=30 Oct 07 13:19:33 crc kubenswrapper[4677]: I1007 13:19:33.697369 4677 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-rfpfx"] Oct 07 13:19:33 crc kubenswrapper[4677]: I1007 13:19:33.697701 4677 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rfpfx" podUID="d5ca9be8-efe6-40e8-9f22-75f3e1644622" containerName="route-controller-manager" containerID="cri-o://0d554f3163dd5bc26cd3eb6d35f855cb3865e8157411ab777e4aefe05d75b98a" gracePeriod=30 Oct 07 13:19:33 crc kubenswrapper[4677]: I1007 13:19:33.894374 4677 generic.go:334] "Generic (PLEG): container finished" podID="807933d0-9a58-4191-9bde-74a00551f72e" containerID="dfd09b2306f81fe19e582e11c9c4b3ce378d6088e025db2fee88df38a1cc3eca" exitCode=0 Oct 07 13:19:33 crc kubenswrapper[4677]: I1007 13:19:33.894469 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-qk4fp" event={"ID":"807933d0-9a58-4191-9bde-74a00551f72e","Type":"ContainerDied","Data":"dfd09b2306f81fe19e582e11c9c4b3ce378d6088e025db2fee88df38a1cc3eca"} Oct 07 13:19:33 crc kubenswrapper[4677]: I1007 13:19:33.900846 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-index-n74hz" event={"ID":"af6b9f6e-0edf-4f47-824e-15ad1a549087","Type":"ContainerStarted","Data":"48971a18e1c23cd095fabcb4d44b250a287070d9617038263e33526fab036c74"} Oct 07 13:19:33 crc kubenswrapper[4677]: I1007 13:19:33.904951 4677 generic.go:334] "Generic (PLEG): container finished" podID="d5ca9be8-efe6-40e8-9f22-75f3e1644622" 
containerID="0d554f3163dd5bc26cd3eb6d35f855cb3865e8157411ab777e4aefe05d75b98a" exitCode=0 Oct 07 13:19:33 crc kubenswrapper[4677]: I1007 13:19:33.904994 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rfpfx" event={"ID":"d5ca9be8-efe6-40e8-9f22-75f3e1644622","Type":"ContainerDied","Data":"0d554f3163dd5bc26cd3eb6d35f855cb3865e8157411ab777e4aefe05d75b98a"} Oct 07 13:19:33 crc kubenswrapper[4677]: I1007 13:19:33.923108 4677 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-index-n74hz" podStartSLOduration=1.764429202 podStartE2EDuration="2.923085194s" podCreationTimestamp="2025-10-07 13:19:31 +0000 UTC" firstStartedPulling="2025-10-07 13:19:32.395303922 +0000 UTC m=+743.881013037" lastFinishedPulling="2025-10-07 13:19:33.553959904 +0000 UTC m=+745.039669029" observedRunningTime="2025-10-07 13:19:33.919265884 +0000 UTC m=+745.404974999" watchObservedRunningTime="2025-10-07 13:19:33.923085194 +0000 UTC m=+745.408794309" Oct 07 13:19:34 crc kubenswrapper[4677]: I1007 13:19:34.030277 4677 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-qk4fp" Oct 07 13:19:34 crc kubenswrapper[4677]: I1007 13:19:34.090394 4677 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rfpfx" Oct 07 13:19:34 crc kubenswrapper[4677]: I1007 13:19:34.142301 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d5ca9be8-efe6-40e8-9f22-75f3e1644622-config\") pod \"d5ca9be8-efe6-40e8-9f22-75f3e1644622\" (UID: \"d5ca9be8-efe6-40e8-9f22-75f3e1644622\") " Oct 07 13:19:34 crc kubenswrapper[4677]: I1007 13:19:34.142386 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zwbdg\" (UniqueName: \"kubernetes.io/projected/807933d0-9a58-4191-9bde-74a00551f72e-kube-api-access-zwbdg\") pod \"807933d0-9a58-4191-9bde-74a00551f72e\" (UID: \"807933d0-9a58-4191-9bde-74a00551f72e\") " Oct 07 13:19:34 crc kubenswrapper[4677]: I1007 13:19:34.142419 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nd4zw\" (UniqueName: \"kubernetes.io/projected/d5ca9be8-efe6-40e8-9f22-75f3e1644622-kube-api-access-nd4zw\") pod \"d5ca9be8-efe6-40e8-9f22-75f3e1644622\" (UID: \"d5ca9be8-efe6-40e8-9f22-75f3e1644622\") " Oct 07 13:19:34 crc kubenswrapper[4677]: I1007 13:19:34.142452 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d5ca9be8-efe6-40e8-9f22-75f3e1644622-client-ca\") pod \"d5ca9be8-efe6-40e8-9f22-75f3e1644622\" (UID: \"d5ca9be8-efe6-40e8-9f22-75f3e1644622\") " Oct 07 13:19:34 crc kubenswrapper[4677]: I1007 13:19:34.142491 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/807933d0-9a58-4191-9bde-74a00551f72e-config\") pod \"807933d0-9a58-4191-9bde-74a00551f72e\" (UID: \"807933d0-9a58-4191-9bde-74a00551f72e\") " Oct 07 13:19:34 crc kubenswrapper[4677]: I1007 13:19:34.142539 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d5ca9be8-efe6-40e8-9f22-75f3e1644622-serving-cert\") pod 
\"d5ca9be8-efe6-40e8-9f22-75f3e1644622\" (UID: \"d5ca9be8-efe6-40e8-9f22-75f3e1644622\") " Oct 07 13:19:34 crc kubenswrapper[4677]: I1007 13:19:34.142562 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/807933d0-9a58-4191-9bde-74a00551f72e-client-ca\") pod \"807933d0-9a58-4191-9bde-74a00551f72e\" (UID: \"807933d0-9a58-4191-9bde-74a00551f72e\") " Oct 07 13:19:34 crc kubenswrapper[4677]: I1007 13:19:34.142577 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/807933d0-9a58-4191-9bde-74a00551f72e-serving-cert\") pod \"807933d0-9a58-4191-9bde-74a00551f72e\" (UID: \"807933d0-9a58-4191-9bde-74a00551f72e\") " Oct 07 13:19:34 crc kubenswrapper[4677]: I1007 13:19:34.142617 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/807933d0-9a58-4191-9bde-74a00551f72e-proxy-ca-bundles\") pod \"807933d0-9a58-4191-9bde-74a00551f72e\" (UID: \"807933d0-9a58-4191-9bde-74a00551f72e\") " Oct 07 13:19:34 crc kubenswrapper[4677]: I1007 13:19:34.143509 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/807933d0-9a58-4191-9bde-74a00551f72e-client-ca" (OuterVolumeSpecName: "client-ca") pod "807933d0-9a58-4191-9bde-74a00551f72e" (UID: "807933d0-9a58-4191-9bde-74a00551f72e"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:19:34 crc kubenswrapper[4677]: I1007 13:19:34.143626 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/807933d0-9a58-4191-9bde-74a00551f72e-config" (OuterVolumeSpecName: "config") pod "807933d0-9a58-4191-9bde-74a00551f72e" (UID: "807933d0-9a58-4191-9bde-74a00551f72e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:19:34 crc kubenswrapper[4677]: I1007 13:19:34.143633 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d5ca9be8-efe6-40e8-9f22-75f3e1644622-config" (OuterVolumeSpecName: "config") pod "d5ca9be8-efe6-40e8-9f22-75f3e1644622" (UID: "d5ca9be8-efe6-40e8-9f22-75f3e1644622"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:19:34 crc kubenswrapper[4677]: I1007 13:19:34.143995 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/807933d0-9a58-4191-9bde-74a00551f72e-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "807933d0-9a58-4191-9bde-74a00551f72e" (UID: "807933d0-9a58-4191-9bde-74a00551f72e"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:19:34 crc kubenswrapper[4677]: I1007 13:19:34.144125 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d5ca9be8-efe6-40e8-9f22-75f3e1644622-client-ca" (OuterVolumeSpecName: "client-ca") pod "d5ca9be8-efe6-40e8-9f22-75f3e1644622" (UID: "d5ca9be8-efe6-40e8-9f22-75f3e1644622"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:19:34 crc kubenswrapper[4677]: I1007 13:19:34.144655 4677 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d5ca9be8-efe6-40e8-9f22-75f3e1644622-client-ca\") on node \"crc\" DevicePath \"\"" Oct 07 13:19:34 crc kubenswrapper[4677]: I1007 13:19:34.144672 4677 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/807933d0-9a58-4191-9bde-74a00551f72e-config\") on node \"crc\" DevicePath \"\"" Oct 07 13:19:34 crc kubenswrapper[4677]: I1007 13:19:34.144681 4677 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/807933d0-9a58-4191-9bde-74a00551f72e-client-ca\") on node \"crc\" DevicePath \"\"" Oct 07 13:19:34 crc kubenswrapper[4677]: I1007 13:19:34.144690 4677 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/807933d0-9a58-4191-9bde-74a00551f72e-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Oct 07 13:19:34 crc kubenswrapper[4677]: I1007 13:19:34.144700 4677 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d5ca9be8-efe6-40e8-9f22-75f3e1644622-config\") on node \"crc\" DevicePath \"\"" Oct 07 13:19:34 crc kubenswrapper[4677]: I1007 13:19:34.147822 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/807933d0-9a58-4191-9bde-74a00551f72e-kube-api-access-zwbdg" (OuterVolumeSpecName: "kube-api-access-zwbdg") pod "807933d0-9a58-4191-9bde-74a00551f72e" (UID: "807933d0-9a58-4191-9bde-74a00551f72e"). InnerVolumeSpecName "kube-api-access-zwbdg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:19:34 crc kubenswrapper[4677]: I1007 13:19:34.147902 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/807933d0-9a58-4191-9bde-74a00551f72e-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "807933d0-9a58-4191-9bde-74a00551f72e" (UID: "807933d0-9a58-4191-9bde-74a00551f72e"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:19:34 crc kubenswrapper[4677]: I1007 13:19:34.147918 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5ca9be8-efe6-40e8-9f22-75f3e1644622-kube-api-access-nd4zw" (OuterVolumeSpecName: "kube-api-access-nd4zw") pod "d5ca9be8-efe6-40e8-9f22-75f3e1644622" (UID: "d5ca9be8-efe6-40e8-9f22-75f3e1644622"). InnerVolumeSpecName "kube-api-access-nd4zw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:19:34 crc kubenswrapper[4677]: I1007 13:19:34.148019 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5ca9be8-efe6-40e8-9f22-75f3e1644622-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "d5ca9be8-efe6-40e8-9f22-75f3e1644622" (UID: "d5ca9be8-efe6-40e8-9f22-75f3e1644622"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:19:34 crc kubenswrapper[4677]: I1007 13:19:34.245898 4677 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zwbdg\" (UniqueName: \"kubernetes.io/projected/807933d0-9a58-4191-9bde-74a00551f72e-kube-api-access-zwbdg\") on node \"crc\" DevicePath \"\"" Oct 07 13:19:34 crc kubenswrapper[4677]: I1007 13:19:34.245929 4677 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nd4zw\" (UniqueName: \"kubernetes.io/projected/d5ca9be8-efe6-40e8-9f22-75f3e1644622-kube-api-access-nd4zw\") on node \"crc\" DevicePath \"\"" Oct 07 13:19:34 crc kubenswrapper[4677]: I1007 13:19:34.245939 4677 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d5ca9be8-efe6-40e8-9f22-75f3e1644622-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 07 13:19:34 crc kubenswrapper[4677]: I1007 13:19:34.245950 4677 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/807933d0-9a58-4191-9bde-74a00551f72e-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 07 13:19:34 crc kubenswrapper[4677]: I1007 13:19:34.775352 4677 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/infra-operator-index-n74hz"] Oct 07 13:19:34 crc kubenswrapper[4677]: I1007 13:19:34.913360 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rfpfx" event={"ID":"d5ca9be8-efe6-40e8-9f22-75f3e1644622","Type":"ContainerDied","Data":"2a6f7fafebf1b1b972b32ee45cc3cfd824d9f284e729b99b23b492b882ba639b"} Oct 07 13:19:34 crc kubenswrapper[4677]: I1007 13:19:34.913387 4677 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rfpfx" Oct 07 13:19:34 crc kubenswrapper[4677]: I1007 13:19:34.913468 4677 scope.go:117] "RemoveContainer" containerID="0d554f3163dd5bc26cd3eb6d35f855cb3865e8157411ab777e4aefe05d75b98a" Oct 07 13:19:34 crc kubenswrapper[4677]: I1007 13:19:34.916919 4677 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-qk4fp" Oct 07 13:19:34 crc kubenswrapper[4677]: I1007 13:19:34.916920 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-qk4fp" event={"ID":"807933d0-9a58-4191-9bde-74a00551f72e","Type":"ContainerDied","Data":"2a7b3dc8f957709c071215c44cf39ae96c90fa479e1853ec57829df5e0cbd65b"} Oct 07 13:19:34 crc kubenswrapper[4677]: I1007 13:19:34.947498 4677 scope.go:117] "RemoveContainer" containerID="dfd09b2306f81fe19e582e11c9c4b3ce378d6088e025db2fee88df38a1cc3eca" Oct 07 13:19:34 crc kubenswrapper[4677]: I1007 13:19:34.950156 4677 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-qk4fp"] Oct 07 13:19:34 crc kubenswrapper[4677]: I1007 13:19:34.963543 4677 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-qk4fp"] Oct 07 13:19:34 crc kubenswrapper[4677]: I1007 13:19:34.973044 4677 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-rfpfx"] Oct 07 13:19:34 crc kubenswrapper[4677]: I1007 13:19:34.982712 4677 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-rfpfx"] Oct 07 13:19:35 crc kubenswrapper[4677]: I1007 13:19:35.315898 4677 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="807933d0-9a58-4191-9bde-74a00551f72e" path="/var/lib/kubelet/pods/807933d0-9a58-4191-9bde-74a00551f72e/volumes" Oct 07 13:19:35 crc kubenswrapper[4677]: I1007 13:19:35.317200 4677 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d5ca9be8-efe6-40e8-9f22-75f3e1644622" path="/var/lib/kubelet/pods/d5ca9be8-efe6-40e8-9f22-75f3e1644622/volumes" Oct 07 13:19:35 crc kubenswrapper[4677]: I1007 13:19:35.379786 4677 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-index-mjjn9"] Oct 07 13:19:35 crc kubenswrapper[4677]: E1007 13:19:35.380089 4677 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="807933d0-9a58-4191-9bde-74a00551f72e" containerName="controller-manager" Oct 07 13:19:35 crc kubenswrapper[4677]: I1007 13:19:35.380109 4677 state_mem.go:107] "Deleted CPUSet assignment" podUID="807933d0-9a58-4191-9bde-74a00551f72e" containerName="controller-manager" Oct 07 13:19:35 crc kubenswrapper[4677]: E1007 13:19:35.380142 4677 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5ca9be8-efe6-40e8-9f22-75f3e1644622" containerName="route-controller-manager" Oct 07 13:19:35 crc kubenswrapper[4677]: I1007 13:19:35.380155 4677 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5ca9be8-efe6-40e8-9f22-75f3e1644622" containerName="route-controller-manager" Oct 07 13:19:35 crc kubenswrapper[4677]: I1007 13:19:35.380348 4677 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5ca9be8-efe6-40e8-9f22-75f3e1644622" containerName="route-controller-manager" Oct 07 13:19:35 crc kubenswrapper[4677]: I1007 13:19:35.380366 4677 memory_manager.go:354] "RemoveStaleState removing state" podUID="807933d0-9a58-4191-9bde-74a00551f72e" containerName="controller-manager" Oct 07 13:19:35 crc kubenswrapper[4677]: I1007 13:19:35.380975 4677 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-index-mjjn9" Oct 07 13:19:35 crc kubenswrapper[4677]: I1007 13:19:35.398708 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-index-mjjn9"] Oct 07 13:19:35 crc kubenswrapper[4677]: I1007 13:19:35.463827 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gss66\" (UniqueName: \"kubernetes.io/projected/3d440c3d-e4d7-4b91-87e7-0f73d587d638-kube-api-access-gss66\") pod \"infra-operator-index-mjjn9\" (UID: \"3d440c3d-e4d7-4b91-87e7-0f73d587d638\") " pod="openstack-operators/infra-operator-index-mjjn9" Oct 07 13:19:35 crc kubenswrapper[4677]: I1007 13:19:35.497852 4677 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8899f86cc-6gwkq"] Oct 07 13:19:35 crc kubenswrapper[4677]: I1007 13:19:35.498747 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-8899f86cc-6gwkq" Oct 07 13:19:35 crc kubenswrapper[4677]: I1007 13:19:35.500292 4677 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Oct 07 13:19:35 crc kubenswrapper[4677]: I1007 13:19:35.501165 4677 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Oct 07 13:19:35 crc kubenswrapper[4677]: I1007 13:19:35.501792 4677 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Oct 07 13:19:35 crc kubenswrapper[4677]: I1007 13:19:35.501865 4677 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Oct 07 13:19:35 crc kubenswrapper[4677]: I1007 13:19:35.501903 4677 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Oct 07 13:19:35 crc kubenswrapper[4677]: I1007 13:19:35.503452 4677 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Oct 07 13:19:35 crc kubenswrapper[4677]: I1007 13:19:35.504041 4677 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-d567955c-bgtfp"] Oct 07 13:19:35 crc kubenswrapper[4677]: I1007 13:19:35.505174 4677 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-d567955c-bgtfp" Oct 07 13:19:35 crc kubenswrapper[4677]: I1007 13:19:35.507308 4677 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Oct 07 13:19:35 crc kubenswrapper[4677]: I1007 13:19:35.508490 4677 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Oct 07 13:19:35 crc kubenswrapper[4677]: I1007 13:19:35.508907 4677 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Oct 07 13:19:35 crc kubenswrapper[4677]: I1007 13:19:35.508941 4677 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Oct 07 13:19:35 crc kubenswrapper[4677]: I1007 13:19:35.509168 4677 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Oct 07 13:19:35 crc kubenswrapper[4677]: I1007 13:19:35.516742 4677 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Oct 07 13:19:35 crc kubenswrapper[4677]: I1007 13:19:35.519887 4677 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Oct 07 13:19:35 crc kubenswrapper[4677]: I1007 13:19:35.520990 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-d567955c-bgtfp"] Oct 07 13:19:35 crc kubenswrapper[4677]: I1007 13:19:35.523700 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8899f86cc-6gwkq"] Oct 07 13:19:35 crc kubenswrapper[4677]: I1007 13:19:35.532646 4677 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-hq5q5" Oct 07 13:19:35 crc kubenswrapper[4677]: I1007 13:19:35.566733 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zk6cd\" (UniqueName: \"kubernetes.io/projected/368bcca7-5355-4f78-bd58-6b6a27f7b4fd-kube-api-access-zk6cd\") pod \"controller-manager-d567955c-bgtfp\" (UID: \"368bcca7-5355-4f78-bd58-6b6a27f7b4fd\") " pod="openshift-controller-manager/controller-manager-d567955c-bgtfp" Oct 07 13:19:35 crc kubenswrapper[4677]: I1007 13:19:35.566775 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/51aba406-c61c-4eff-a44c-6c34cfc1f30a-client-ca\") pod \"route-controller-manager-8899f86cc-6gwkq\" (UID: \"51aba406-c61c-4eff-a44c-6c34cfc1f30a\") " pod="openshift-route-controller-manager/route-controller-manager-8899f86cc-6gwkq" Oct 07 13:19:35 crc kubenswrapper[4677]: I1007 13:19:35.566800 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/368bcca7-5355-4f78-bd58-6b6a27f7b4fd-client-ca\") pod \"controller-manager-d567955c-bgtfp\" (UID: \"368bcca7-5355-4f78-bd58-6b6a27f7b4fd\") " pod="openshift-controller-manager/controller-manager-d567955c-bgtfp" Oct 07 13:19:35 crc kubenswrapper[4677]: I1007 13:19:35.566918 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/51aba406-c61c-4eff-a44c-6c34cfc1f30a-serving-cert\") pod 
\"route-controller-manager-8899f86cc-6gwkq\" (UID: \"51aba406-c61c-4eff-a44c-6c34cfc1f30a\") " pod="openshift-route-controller-manager/route-controller-manager-8899f86cc-6gwkq" Oct 07 13:19:35 crc kubenswrapper[4677]: I1007 13:19:35.566966 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/51aba406-c61c-4eff-a44c-6c34cfc1f30a-config\") pod \"route-controller-manager-8899f86cc-6gwkq\" (UID: \"51aba406-c61c-4eff-a44c-6c34cfc1f30a\") " pod="openshift-route-controller-manager/route-controller-manager-8899f86cc-6gwkq" Oct 07 13:19:35 crc kubenswrapper[4677]: I1007 13:19:35.567012 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q4c4x\" (UniqueName: \"kubernetes.io/projected/51aba406-c61c-4eff-a44c-6c34cfc1f30a-kube-api-access-q4c4x\") pod \"route-controller-manager-8899f86cc-6gwkq\" (UID: \"51aba406-c61c-4eff-a44c-6c34cfc1f30a\") " pod="openshift-route-controller-manager/route-controller-manager-8899f86cc-6gwkq" Oct 07 13:19:35 crc kubenswrapper[4677]: I1007 13:19:35.567052 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/368bcca7-5355-4f78-bd58-6b6a27f7b4fd-proxy-ca-bundles\") pod \"controller-manager-d567955c-bgtfp\" (UID: \"368bcca7-5355-4f78-bd58-6b6a27f7b4fd\") " pod="openshift-controller-manager/controller-manager-d567955c-bgtfp" Oct 07 13:19:35 crc kubenswrapper[4677]: I1007 13:19:35.567105 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/368bcca7-5355-4f78-bd58-6b6a27f7b4fd-config\") pod \"controller-manager-d567955c-bgtfp\" (UID: \"368bcca7-5355-4f78-bd58-6b6a27f7b4fd\") " pod="openshift-controller-manager/controller-manager-d567955c-bgtfp" Oct 07 13:19:35 crc kubenswrapper[4677]: I1007 13:19:35.567142 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/368bcca7-5355-4f78-bd58-6b6a27f7b4fd-serving-cert\") pod \"controller-manager-d567955c-bgtfp\" (UID: \"368bcca7-5355-4f78-bd58-6b6a27f7b4fd\") " pod="openshift-controller-manager/controller-manager-d567955c-bgtfp" Oct 07 13:19:35 crc kubenswrapper[4677]: I1007 13:19:35.567236 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gss66\" (UniqueName: \"kubernetes.io/projected/3d440c3d-e4d7-4b91-87e7-0f73d587d638-kube-api-access-gss66\") pod \"infra-operator-index-mjjn9\" (UID: \"3d440c3d-e4d7-4b91-87e7-0f73d587d638\") " pod="openstack-operators/infra-operator-index-mjjn9" Oct 07 13:19:35 crc kubenswrapper[4677]: I1007 13:19:35.573534 4677 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-hq5q5" Oct 07 13:19:35 crc kubenswrapper[4677]: I1007 13:19:35.586302 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gss66\" (UniqueName: \"kubernetes.io/projected/3d440c3d-e4d7-4b91-87e7-0f73d587d638-kube-api-access-gss66\") pod \"infra-operator-index-mjjn9\" (UID: \"3d440c3d-e4d7-4b91-87e7-0f73d587d638\") " pod="openstack-operators/infra-operator-index-mjjn9" Oct 07 13:19:35 crc kubenswrapper[4677]: I1007 13:19:35.667968 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q4c4x\" (UniqueName: 
\"kubernetes.io/projected/51aba406-c61c-4eff-a44c-6c34cfc1f30a-kube-api-access-q4c4x\") pod \"route-controller-manager-8899f86cc-6gwkq\" (UID: \"51aba406-c61c-4eff-a44c-6c34cfc1f30a\") " pod="openshift-route-controller-manager/route-controller-manager-8899f86cc-6gwkq" Oct 07 13:19:35 crc kubenswrapper[4677]: I1007 13:19:35.668009 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/368bcca7-5355-4f78-bd58-6b6a27f7b4fd-config\") pod \"controller-manager-d567955c-bgtfp\" (UID: \"368bcca7-5355-4f78-bd58-6b6a27f7b4fd\") " pod="openshift-controller-manager/controller-manager-d567955c-bgtfp" Oct 07 13:19:35 crc kubenswrapper[4677]: I1007 13:19:35.668024 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/368bcca7-5355-4f78-bd58-6b6a27f7b4fd-proxy-ca-bundles\") pod \"controller-manager-d567955c-bgtfp\" (UID: \"368bcca7-5355-4f78-bd58-6b6a27f7b4fd\") " pod="openshift-controller-manager/controller-manager-d567955c-bgtfp" Oct 07 13:19:35 crc kubenswrapper[4677]: I1007 13:19:35.668043 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/368bcca7-5355-4f78-bd58-6b6a27f7b4fd-serving-cert\") pod \"controller-manager-d567955c-bgtfp\" (UID: \"368bcca7-5355-4f78-bd58-6b6a27f7b4fd\") " pod="openshift-controller-manager/controller-manager-d567955c-bgtfp" Oct 07 13:19:35 crc kubenswrapper[4677]: I1007 13:19:35.668080 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zk6cd\" (UniqueName: \"kubernetes.io/projected/368bcca7-5355-4f78-bd58-6b6a27f7b4fd-kube-api-access-zk6cd\") pod \"controller-manager-d567955c-bgtfp\" (UID: \"368bcca7-5355-4f78-bd58-6b6a27f7b4fd\") " pod="openshift-controller-manager/controller-manager-d567955c-bgtfp" Oct 07 13:19:35 crc kubenswrapper[4677]: I1007 13:19:35.668100 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/51aba406-c61c-4eff-a44c-6c34cfc1f30a-client-ca\") pod \"route-controller-manager-8899f86cc-6gwkq\" (UID: \"51aba406-c61c-4eff-a44c-6c34cfc1f30a\") " pod="openshift-route-controller-manager/route-controller-manager-8899f86cc-6gwkq" Oct 07 13:19:35 crc kubenswrapper[4677]: I1007 13:19:35.668122 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/368bcca7-5355-4f78-bd58-6b6a27f7b4fd-client-ca\") pod \"controller-manager-d567955c-bgtfp\" (UID: \"368bcca7-5355-4f78-bd58-6b6a27f7b4fd\") " pod="openshift-controller-manager/controller-manager-d567955c-bgtfp" Oct 07 13:19:35 crc kubenswrapper[4677]: I1007 13:19:35.669043 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/51aba406-c61c-4eff-a44c-6c34cfc1f30a-client-ca\") pod \"route-controller-manager-8899f86cc-6gwkq\" (UID: \"51aba406-c61c-4eff-a44c-6c34cfc1f30a\") " pod="openshift-route-controller-manager/route-controller-manager-8899f86cc-6gwkq" Oct 07 13:19:35 crc kubenswrapper[4677]: I1007 13:19:35.669131 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/368bcca7-5355-4f78-bd58-6b6a27f7b4fd-client-ca\") pod \"controller-manager-d567955c-bgtfp\" (UID: \"368bcca7-5355-4f78-bd58-6b6a27f7b4fd\") " 
pod="openshift-controller-manager/controller-manager-d567955c-bgtfp" Oct 07 13:19:35 crc kubenswrapper[4677]: I1007 13:19:35.669137 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/51aba406-c61c-4eff-a44c-6c34cfc1f30a-serving-cert\") pod \"route-controller-manager-8899f86cc-6gwkq\" (UID: \"51aba406-c61c-4eff-a44c-6c34cfc1f30a\") " pod="openshift-route-controller-manager/route-controller-manager-8899f86cc-6gwkq" Oct 07 13:19:35 crc kubenswrapper[4677]: I1007 13:19:35.669205 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/51aba406-c61c-4eff-a44c-6c34cfc1f30a-config\") pod \"route-controller-manager-8899f86cc-6gwkq\" (UID: \"51aba406-c61c-4eff-a44c-6c34cfc1f30a\") " pod="openshift-route-controller-manager/route-controller-manager-8899f86cc-6gwkq" Oct 07 13:19:35 crc kubenswrapper[4677]: I1007 13:19:35.669507 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/368bcca7-5355-4f78-bd58-6b6a27f7b4fd-proxy-ca-bundles\") pod \"controller-manager-d567955c-bgtfp\" (UID: \"368bcca7-5355-4f78-bd58-6b6a27f7b4fd\") " pod="openshift-controller-manager/controller-manager-d567955c-bgtfp" Oct 07 13:19:35 crc kubenswrapper[4677]: I1007 13:19:35.669816 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/368bcca7-5355-4f78-bd58-6b6a27f7b4fd-config\") pod \"controller-manager-d567955c-bgtfp\" (UID: \"368bcca7-5355-4f78-bd58-6b6a27f7b4fd\") " pod="openshift-controller-manager/controller-manager-d567955c-bgtfp" Oct 07 13:19:35 crc kubenswrapper[4677]: I1007 13:19:35.670035 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/51aba406-c61c-4eff-a44c-6c34cfc1f30a-config\") pod \"route-controller-manager-8899f86cc-6gwkq\" (UID: \"51aba406-c61c-4eff-a44c-6c34cfc1f30a\") " pod="openshift-route-controller-manager/route-controller-manager-8899f86cc-6gwkq" Oct 07 13:19:35 crc kubenswrapper[4677]: I1007 13:19:35.672144 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/368bcca7-5355-4f78-bd58-6b6a27f7b4fd-serving-cert\") pod \"controller-manager-d567955c-bgtfp\" (UID: \"368bcca7-5355-4f78-bd58-6b6a27f7b4fd\") " pod="openshift-controller-manager/controller-manager-d567955c-bgtfp" Oct 07 13:19:35 crc kubenswrapper[4677]: I1007 13:19:35.673364 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/51aba406-c61c-4eff-a44c-6c34cfc1f30a-serving-cert\") pod \"route-controller-manager-8899f86cc-6gwkq\" (UID: \"51aba406-c61c-4eff-a44c-6c34cfc1f30a\") " pod="openshift-route-controller-manager/route-controller-manager-8899f86cc-6gwkq" Oct 07 13:19:35 crc kubenswrapper[4677]: I1007 13:19:35.690463 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zk6cd\" (UniqueName: \"kubernetes.io/projected/368bcca7-5355-4f78-bd58-6b6a27f7b4fd-kube-api-access-zk6cd\") pod \"controller-manager-d567955c-bgtfp\" (UID: \"368bcca7-5355-4f78-bd58-6b6a27f7b4fd\") " pod="openshift-controller-manager/controller-manager-d567955c-bgtfp" Oct 07 13:19:35 crc kubenswrapper[4677]: I1007 13:19:35.695993 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-q4c4x\" (UniqueName: \"kubernetes.io/projected/51aba406-c61c-4eff-a44c-6c34cfc1f30a-kube-api-access-q4c4x\") pod \"route-controller-manager-8899f86cc-6gwkq\" (UID: \"51aba406-c61c-4eff-a44c-6c34cfc1f30a\") " pod="openshift-route-controller-manager/route-controller-manager-8899f86cc-6gwkq" Oct 07 13:19:35 crc kubenswrapper[4677]: I1007 13:19:35.704938 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-index-mjjn9" Oct 07 13:19:35 crc kubenswrapper[4677]: I1007 13:19:35.815735 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-8899f86cc-6gwkq" Oct 07 13:19:35 crc kubenswrapper[4677]: I1007 13:19:35.823645 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-d567955c-bgtfp" Oct 07 13:19:35 crc kubenswrapper[4677]: I1007 13:19:35.924886 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-index-mjjn9"] Oct 07 13:19:35 crc kubenswrapper[4677]: I1007 13:19:35.951332 4677 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/infra-operator-index-n74hz" podUID="af6b9f6e-0edf-4f47-824e-15ad1a549087" containerName="registry-server" containerID="cri-o://48971a18e1c23cd095fabcb4d44b250a287070d9617038263e33526fab036c74" gracePeriod=2 Oct 07 13:19:36 crc kubenswrapper[4677]: I1007 13:19:36.248039 4677 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-index-n74hz" Oct 07 13:19:36 crc kubenswrapper[4677]: I1007 13:19:36.268797 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8899f86cc-6gwkq"] Oct 07 13:19:36 crc kubenswrapper[4677]: I1007 13:19:36.278933 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d29h\" (UniqueName: \"kubernetes.io/projected/af6b9f6e-0edf-4f47-824e-15ad1a549087-kube-api-access-4d29h\") pod \"af6b9f6e-0edf-4f47-824e-15ad1a549087\" (UID: \"af6b9f6e-0edf-4f47-824e-15ad1a549087\") " Oct 07 13:19:36 crc kubenswrapper[4677]: W1007 13:19:36.280655 4677 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod51aba406_c61c_4eff_a44c_6c34cfc1f30a.slice/crio-c86223c2e92aac9f9e535c56bff8df8a2d09fe1233f37775f357d5de87ebe7b1 WatchSource:0}: Error finding container c86223c2e92aac9f9e535c56bff8df8a2d09fe1233f37775f357d5de87ebe7b1: Status 404 returned error can't find the container with id c86223c2e92aac9f9e535c56bff8df8a2d09fe1233f37775f357d5de87ebe7b1 Oct 07 13:19:36 crc kubenswrapper[4677]: I1007 13:19:36.283323 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af6b9f6e-0edf-4f47-824e-15ad1a549087-kube-api-access-4d29h" (OuterVolumeSpecName: "kube-api-access-4d29h") pod "af6b9f6e-0edf-4f47-824e-15ad1a549087" (UID: "af6b9f6e-0edf-4f47-824e-15ad1a549087"). InnerVolumeSpecName "kube-api-access-4d29h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:19:36 crc kubenswrapper[4677]: I1007 13:19:36.321919 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-d567955c-bgtfp"] Oct 07 13:19:36 crc kubenswrapper[4677]: W1007 13:19:36.334530 4677 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod368bcca7_5355_4f78_bd58_6b6a27f7b4fd.slice/crio-8aa7d6d7c4fc5ed4250ea4503ae767ff6d129bd9bd073b893b738144a093359b WatchSource:0}: Error finding container 8aa7d6d7c4fc5ed4250ea4503ae767ff6d129bd9bd073b893b738144a093359b: Status 404 returned error can't find the container with id 8aa7d6d7c4fc5ed4250ea4503ae767ff6d129bd9bd073b893b738144a093359b Oct 07 13:19:36 crc kubenswrapper[4677]: I1007 13:19:36.382395 4677 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d29h\" (UniqueName: \"kubernetes.io/projected/af6b9f6e-0edf-4f47-824e-15ad1a549087-kube-api-access-4d29h\") on node \"crc\" DevicePath \"\"" Oct 07 13:19:36 crc kubenswrapper[4677]: I1007 13:19:36.958266 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-d567955c-bgtfp" event={"ID":"368bcca7-5355-4f78-bd58-6b6a27f7b4fd","Type":"ContainerStarted","Data":"28fd5fc67e70f2c0788c36c96dc5a2cc41b03172a2cfa419d6129948b3d38225"} Oct 07 13:19:36 crc kubenswrapper[4677]: I1007 13:19:36.958528 4677 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-d567955c-bgtfp" Oct 07 13:19:36 crc kubenswrapper[4677]: I1007 13:19:36.958540 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-d567955c-bgtfp" event={"ID":"368bcca7-5355-4f78-bd58-6b6a27f7b4fd","Type":"ContainerStarted","Data":"8aa7d6d7c4fc5ed4250ea4503ae767ff6d129bd9bd073b893b738144a093359b"} Oct 07 13:19:36 crc kubenswrapper[4677]: I1007 13:19:36.959320 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-index-mjjn9" event={"ID":"3d440c3d-e4d7-4b91-87e7-0f73d587d638","Type":"ContainerStarted","Data":"c068648479cf839709652c29f903e53bc9e5c7ca1e9f657457acb461ff14b994"} Oct 07 13:19:36 crc kubenswrapper[4677]: I1007 13:19:36.960818 4677 generic.go:334] "Generic (PLEG): container finished" podID="af6b9f6e-0edf-4f47-824e-15ad1a549087" containerID="48971a18e1c23cd095fabcb4d44b250a287070d9617038263e33526fab036c74" exitCode=0 Oct 07 13:19:36 crc kubenswrapper[4677]: I1007 13:19:36.960869 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-index-n74hz" event={"ID":"af6b9f6e-0edf-4f47-824e-15ad1a549087","Type":"ContainerDied","Data":"48971a18e1c23cd095fabcb4d44b250a287070d9617038263e33526fab036c74"} Oct 07 13:19:36 crc kubenswrapper[4677]: I1007 13:19:36.960871 4677 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-index-n74hz" Oct 07 13:19:36 crc kubenswrapper[4677]: I1007 13:19:36.960890 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-index-n74hz" event={"ID":"af6b9f6e-0edf-4f47-824e-15ad1a549087","Type":"ContainerDied","Data":"f76499805c7cb7b9b504bcde65328fbde3079e06557edd62385eff12933103c7"} Oct 07 13:19:36 crc kubenswrapper[4677]: I1007 13:19:36.960906 4677 scope.go:117] "RemoveContainer" containerID="48971a18e1c23cd095fabcb4d44b250a287070d9617038263e33526fab036c74" Oct 07 13:19:36 crc kubenswrapper[4677]: I1007 13:19:36.966879 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-8899f86cc-6gwkq" event={"ID":"51aba406-c61c-4eff-a44c-6c34cfc1f30a","Type":"ContainerStarted","Data":"58381edf7c3d4311e26332d97ac4f598a179a136f02735b81de3c5ca490224af"} Oct 07 13:19:36 crc kubenswrapper[4677]: I1007 13:19:36.966912 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-8899f86cc-6gwkq" event={"ID":"51aba406-c61c-4eff-a44c-6c34cfc1f30a","Type":"ContainerStarted","Data":"c86223c2e92aac9f9e535c56bff8df8a2d09fe1233f37775f357d5de87ebe7b1"} Oct 07 13:19:36 crc kubenswrapper[4677]: I1007 13:19:36.967620 4677 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-8899f86cc-6gwkq" Oct 07 13:19:36 crc kubenswrapper[4677]: I1007 13:19:36.982069 4677 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-d567955c-bgtfp" podStartSLOduration=3.982047798 podStartE2EDuration="3.982047798s" podCreationTimestamp="2025-10-07 13:19:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:19:36.979720391 +0000 UTC m=+748.465429506" watchObservedRunningTime="2025-10-07 13:19:36.982047798 +0000 UTC m=+748.467756903" Oct 07 13:19:36 crc kubenswrapper[4677]: I1007 13:19:36.987773 4677 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-d567955c-bgtfp" Oct 07 13:19:36 crc kubenswrapper[4677]: I1007 13:19:36.987876 4677 scope.go:117] "RemoveContainer" containerID="48971a18e1c23cd095fabcb4d44b250a287070d9617038263e33526fab036c74" Oct 07 13:19:36 crc kubenswrapper[4677]: E1007 13:19:36.988239 4677 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"48971a18e1c23cd095fabcb4d44b250a287070d9617038263e33526fab036c74\": container with ID starting with 48971a18e1c23cd095fabcb4d44b250a287070d9617038263e33526fab036c74 not found: ID does not exist" containerID="48971a18e1c23cd095fabcb4d44b250a287070d9617038263e33526fab036c74" Oct 07 13:19:36 crc kubenswrapper[4677]: I1007 13:19:36.988270 4677 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"48971a18e1c23cd095fabcb4d44b250a287070d9617038263e33526fab036c74"} err="failed to get container status \"48971a18e1c23cd095fabcb4d44b250a287070d9617038263e33526fab036c74\": rpc error: code = NotFound desc = could not find container \"48971a18e1c23cd095fabcb4d44b250a287070d9617038263e33526fab036c74\": container with ID starting with 48971a18e1c23cd095fabcb4d44b250a287070d9617038263e33526fab036c74 not found: ID does not exist" Oct 07 13:19:37 crc 
kubenswrapper[4677]: I1007 13:19:37.001898 4677 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-8899f86cc-6gwkq" podStartSLOduration=4.001884688 podStartE2EDuration="4.001884688s" podCreationTimestamp="2025-10-07 13:19:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:19:36.998779999 +0000 UTC m=+748.484489114" watchObservedRunningTime="2025-10-07 13:19:37.001884688 +0000 UTC m=+748.487593803" Oct 07 13:19:37 crc kubenswrapper[4677]: I1007 13:19:37.028553 4677 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/infra-operator-index-n74hz"] Oct 07 13:19:37 crc kubenswrapper[4677]: I1007 13:19:37.031578 4677 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/infra-operator-index-n74hz"] Oct 07 13:19:37 crc kubenswrapper[4677]: I1007 13:19:37.057250 4677 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-8899f86cc-6gwkq" Oct 07 13:19:37 crc kubenswrapper[4677]: I1007 13:19:37.310976 4677 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af6b9f6e-0edf-4f47-824e-15ad1a549087" path="/var/lib/kubelet/pods/af6b9f6e-0edf-4f47-824e-15ad1a549087/volumes" Oct 07 13:19:37 crc kubenswrapper[4677]: I1007 13:19:37.979290 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-index-mjjn9" event={"ID":"3d440c3d-e4d7-4b91-87e7-0f73d587d638","Type":"ContainerStarted","Data":"108450553efb00f057badb300fdcfc77a8ab2ccf958931a17a1377bd0c6c4703"} Oct 07 13:19:38 crc kubenswrapper[4677]: I1007 13:19:38.013219 4677 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-index-mjjn9" podStartSLOduration=2.081843277 podStartE2EDuration="3.013187366s" podCreationTimestamp="2025-10-07 13:19:35 +0000 UTC" firstStartedPulling="2025-10-07 13:19:35.943593 +0000 UTC m=+747.429302115" lastFinishedPulling="2025-10-07 13:19:36.874937089 +0000 UTC m=+748.360646204" observedRunningTime="2025-10-07 13:19:38.005523906 +0000 UTC m=+749.491233071" watchObservedRunningTime="2025-10-07 13:19:38.013187366 +0000 UTC m=+749.498896541" Oct 07 13:19:39 crc kubenswrapper[4677]: I1007 13:19:39.932890 4677 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-g8px2" Oct 07 13:19:40 crc kubenswrapper[4677]: I1007 13:19:40.536616 4677 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-hq5q5" Oct 07 13:19:45 crc kubenswrapper[4677]: I1007 13:19:45.365472 4677 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Oct 07 13:19:45 crc kubenswrapper[4677]: I1007 13:19:45.706328 4677 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/infra-operator-index-mjjn9" Oct 07 13:19:45 crc kubenswrapper[4677]: I1007 13:19:45.706691 4677 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-index-mjjn9" Oct 07 13:19:45 crc kubenswrapper[4677]: I1007 13:19:45.738115 4677 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/infra-operator-index-mjjn9" Oct 07 13:19:46 crc kubenswrapper[4677]: I1007 13:19:46.085113 4677 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-index-mjjn9" Oct 07 13:19:47 crc kubenswrapper[4677]: I1007 13:19:47.415205 4677 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/4cb402f945d54a80688ab4565a4e9d19d5d4eb730a5ce0fdf7f49eb313zx98r"] Oct 07 13:19:47 crc kubenswrapper[4677]: E1007 13:19:47.415467 4677 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af6b9f6e-0edf-4f47-824e-15ad1a549087" containerName="registry-server" Oct 07 13:19:47 crc kubenswrapper[4677]: I1007 13:19:47.415480 4677 state_mem.go:107] "Deleted CPUSet assignment" podUID="af6b9f6e-0edf-4f47-824e-15ad1a549087" containerName="registry-server" Oct 07 13:19:47 crc kubenswrapper[4677]: I1007 13:19:47.416053 4677 memory_manager.go:354] "RemoveStaleState removing state" podUID="af6b9f6e-0edf-4f47-824e-15ad1a549087" containerName="registry-server" Oct 07 13:19:47 crc kubenswrapper[4677]: I1007 13:19:47.416986 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/4cb402f945d54a80688ab4565a4e9d19d5d4eb730a5ce0fdf7f49eb313zx98r" Oct 07 13:19:47 crc kubenswrapper[4677]: I1007 13:19:47.419002 4677 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-krhf2" Oct 07 13:19:47 crc kubenswrapper[4677]: I1007 13:19:47.432247 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/4cb402f945d54a80688ab4565a4e9d19d5d4eb730a5ce0fdf7f49eb313zx98r"] Oct 07 13:19:47 crc kubenswrapper[4677]: I1007 13:19:47.468731 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bb5qk\" (UniqueName: \"kubernetes.io/projected/0182d27a-0a19-4496-a514-75b6ca93d5b7-kube-api-access-bb5qk\") pod \"4cb402f945d54a80688ab4565a4e9d19d5d4eb730a5ce0fdf7f49eb313zx98r\" (UID: \"0182d27a-0a19-4496-a514-75b6ca93d5b7\") " pod="openstack-operators/4cb402f945d54a80688ab4565a4e9d19d5d4eb730a5ce0fdf7f49eb313zx98r" Oct 07 13:19:47 crc kubenswrapper[4677]: I1007 13:19:47.468788 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0182d27a-0a19-4496-a514-75b6ca93d5b7-bundle\") pod \"4cb402f945d54a80688ab4565a4e9d19d5d4eb730a5ce0fdf7f49eb313zx98r\" (UID: \"0182d27a-0a19-4496-a514-75b6ca93d5b7\") " pod="openstack-operators/4cb402f945d54a80688ab4565a4e9d19d5d4eb730a5ce0fdf7f49eb313zx98r" Oct 07 13:19:47 crc kubenswrapper[4677]: I1007 13:19:47.468828 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0182d27a-0a19-4496-a514-75b6ca93d5b7-util\") pod \"4cb402f945d54a80688ab4565a4e9d19d5d4eb730a5ce0fdf7f49eb313zx98r\" (UID: \"0182d27a-0a19-4496-a514-75b6ca93d5b7\") " pod="openstack-operators/4cb402f945d54a80688ab4565a4e9d19d5d4eb730a5ce0fdf7f49eb313zx98r" Oct 07 13:19:47 crc kubenswrapper[4677]: I1007 13:19:47.569881 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bb5qk\" (UniqueName: \"kubernetes.io/projected/0182d27a-0a19-4496-a514-75b6ca93d5b7-kube-api-access-bb5qk\") pod \"4cb402f945d54a80688ab4565a4e9d19d5d4eb730a5ce0fdf7f49eb313zx98r\" (UID: \"0182d27a-0a19-4496-a514-75b6ca93d5b7\") " pod="openstack-operators/4cb402f945d54a80688ab4565a4e9d19d5d4eb730a5ce0fdf7f49eb313zx98r" Oct 07 13:19:47 crc kubenswrapper[4677]: I1007 13:19:47.569949 4677 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0182d27a-0a19-4496-a514-75b6ca93d5b7-bundle\") pod \"4cb402f945d54a80688ab4565a4e9d19d5d4eb730a5ce0fdf7f49eb313zx98r\" (UID: \"0182d27a-0a19-4496-a514-75b6ca93d5b7\") " pod="openstack-operators/4cb402f945d54a80688ab4565a4e9d19d5d4eb730a5ce0fdf7f49eb313zx98r" Oct 07 13:19:47 crc kubenswrapper[4677]: I1007 13:19:47.569990 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0182d27a-0a19-4496-a514-75b6ca93d5b7-util\") pod \"4cb402f945d54a80688ab4565a4e9d19d5d4eb730a5ce0fdf7f49eb313zx98r\" (UID: \"0182d27a-0a19-4496-a514-75b6ca93d5b7\") " pod="openstack-operators/4cb402f945d54a80688ab4565a4e9d19d5d4eb730a5ce0fdf7f49eb313zx98r" Oct 07 13:19:47 crc kubenswrapper[4677]: I1007 13:19:47.570483 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0182d27a-0a19-4496-a514-75b6ca93d5b7-bundle\") pod \"4cb402f945d54a80688ab4565a4e9d19d5d4eb730a5ce0fdf7f49eb313zx98r\" (UID: \"0182d27a-0a19-4496-a514-75b6ca93d5b7\") " pod="openstack-operators/4cb402f945d54a80688ab4565a4e9d19d5d4eb730a5ce0fdf7f49eb313zx98r" Oct 07 13:19:47 crc kubenswrapper[4677]: I1007 13:19:47.570601 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0182d27a-0a19-4496-a514-75b6ca93d5b7-util\") pod \"4cb402f945d54a80688ab4565a4e9d19d5d4eb730a5ce0fdf7f49eb313zx98r\" (UID: \"0182d27a-0a19-4496-a514-75b6ca93d5b7\") " pod="openstack-operators/4cb402f945d54a80688ab4565a4e9d19d5d4eb730a5ce0fdf7f49eb313zx98r" Oct 07 13:19:47 crc kubenswrapper[4677]: I1007 13:19:47.595852 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bb5qk\" (UniqueName: \"kubernetes.io/projected/0182d27a-0a19-4496-a514-75b6ca93d5b7-kube-api-access-bb5qk\") pod \"4cb402f945d54a80688ab4565a4e9d19d5d4eb730a5ce0fdf7f49eb313zx98r\" (UID: \"0182d27a-0a19-4496-a514-75b6ca93d5b7\") " pod="openstack-operators/4cb402f945d54a80688ab4565a4e9d19d5d4eb730a5ce0fdf7f49eb313zx98r" Oct 07 13:19:47 crc kubenswrapper[4677]: I1007 13:19:47.732349 4677 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/4cb402f945d54a80688ab4565a4e9d19d5d4eb730a5ce0fdf7f49eb313zx98r" Oct 07 13:19:48 crc kubenswrapper[4677]: I1007 13:19:48.244249 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/4cb402f945d54a80688ab4565a4e9d19d5d4eb730a5ce0fdf7f49eb313zx98r"] Oct 07 13:19:48 crc kubenswrapper[4677]: W1007 13:19:48.252494 4677 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0182d27a_0a19_4496_a514_75b6ca93d5b7.slice/crio-2f8baf2296125f49004bff193f58a945c91668721abe38fdcf71fe20d43f9792 WatchSource:0}: Error finding container 2f8baf2296125f49004bff193f58a945c91668721abe38fdcf71fe20d43f9792: Status 404 returned error can't find the container with id 2f8baf2296125f49004bff193f58a945c91668721abe38fdcf71fe20d43f9792 Oct 07 13:19:49 crc kubenswrapper[4677]: I1007 13:19:49.068283 4677 generic.go:334] "Generic (PLEG): container finished" podID="0182d27a-0a19-4496-a514-75b6ca93d5b7" containerID="92010fddc8e22ad080f25639af06d5e5bd9194ff47a8120801fed1c37f956f0d" exitCode=0 Oct 07 13:19:49 crc kubenswrapper[4677]: I1007 13:19:49.068348 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/4cb402f945d54a80688ab4565a4e9d19d5d4eb730a5ce0fdf7f49eb313zx98r" event={"ID":"0182d27a-0a19-4496-a514-75b6ca93d5b7","Type":"ContainerDied","Data":"92010fddc8e22ad080f25639af06d5e5bd9194ff47a8120801fed1c37f956f0d"} Oct 07 13:19:49 crc kubenswrapper[4677]: I1007 13:19:49.068403 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/4cb402f945d54a80688ab4565a4e9d19d5d4eb730a5ce0fdf7f49eb313zx98r" event={"ID":"0182d27a-0a19-4496-a514-75b6ca93d5b7","Type":"ContainerStarted","Data":"2f8baf2296125f49004bff193f58a945c91668721abe38fdcf71fe20d43f9792"} Oct 07 13:19:51 crc kubenswrapper[4677]: I1007 13:19:51.093654 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/4cb402f945d54a80688ab4565a4e9d19d5d4eb730a5ce0fdf7f49eb313zx98r" event={"ID":"0182d27a-0a19-4496-a514-75b6ca93d5b7","Type":"ContainerStarted","Data":"17f2f6c42ca454551c29ad8f0e31600ecdd63b1c1cc2d590b728582faacdad31"} Oct 07 13:19:52 crc kubenswrapper[4677]: I1007 13:19:52.126721 4677 generic.go:334] "Generic (PLEG): container finished" podID="0182d27a-0a19-4496-a514-75b6ca93d5b7" containerID="17f2f6c42ca454551c29ad8f0e31600ecdd63b1c1cc2d590b728582faacdad31" exitCode=0 Oct 07 13:19:52 crc kubenswrapper[4677]: I1007 13:19:52.127064 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/4cb402f945d54a80688ab4565a4e9d19d5d4eb730a5ce0fdf7f49eb313zx98r" event={"ID":"0182d27a-0a19-4496-a514-75b6ca93d5b7","Type":"ContainerDied","Data":"17f2f6c42ca454551c29ad8f0e31600ecdd63b1c1cc2d590b728582faacdad31"} Oct 07 13:19:53 crc kubenswrapper[4677]: I1007 13:19:53.138359 4677 generic.go:334] "Generic (PLEG): container finished" podID="0182d27a-0a19-4496-a514-75b6ca93d5b7" containerID="0375e1d9f3b6c4bb3acd12bcbc377800984e5d01f9cf5aeb7ce6f626e0654505" exitCode=0 Oct 07 13:19:53 crc kubenswrapper[4677]: I1007 13:19:53.138515 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/4cb402f945d54a80688ab4565a4e9d19d5d4eb730a5ce0fdf7f49eb313zx98r" event={"ID":"0182d27a-0a19-4496-a514-75b6ca93d5b7","Type":"ContainerDied","Data":"0375e1d9f3b6c4bb3acd12bcbc377800984e5d01f9cf5aeb7ce6f626e0654505"} Oct 07 13:19:54 crc kubenswrapper[4677]: I1007 13:19:54.613485 4677 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/4cb402f945d54a80688ab4565a4e9d19d5d4eb730a5ce0fdf7f49eb313zx98r" Oct 07 13:19:54 crc kubenswrapper[4677]: I1007 13:19:54.671793 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0182d27a-0a19-4496-a514-75b6ca93d5b7-bundle\") pod \"0182d27a-0a19-4496-a514-75b6ca93d5b7\" (UID: \"0182d27a-0a19-4496-a514-75b6ca93d5b7\") " Oct 07 13:19:54 crc kubenswrapper[4677]: I1007 13:19:54.671866 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bb5qk\" (UniqueName: \"kubernetes.io/projected/0182d27a-0a19-4496-a514-75b6ca93d5b7-kube-api-access-bb5qk\") pod \"0182d27a-0a19-4496-a514-75b6ca93d5b7\" (UID: \"0182d27a-0a19-4496-a514-75b6ca93d5b7\") " Oct 07 13:19:54 crc kubenswrapper[4677]: I1007 13:19:54.671953 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0182d27a-0a19-4496-a514-75b6ca93d5b7-util\") pod \"0182d27a-0a19-4496-a514-75b6ca93d5b7\" (UID: \"0182d27a-0a19-4496-a514-75b6ca93d5b7\") " Oct 07 13:19:54 crc kubenswrapper[4677]: I1007 13:19:54.673382 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0182d27a-0a19-4496-a514-75b6ca93d5b7-bundle" (OuterVolumeSpecName: "bundle") pod "0182d27a-0a19-4496-a514-75b6ca93d5b7" (UID: "0182d27a-0a19-4496-a514-75b6ca93d5b7"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:19:54 crc kubenswrapper[4677]: I1007 13:19:54.678981 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0182d27a-0a19-4496-a514-75b6ca93d5b7-kube-api-access-bb5qk" (OuterVolumeSpecName: "kube-api-access-bb5qk") pod "0182d27a-0a19-4496-a514-75b6ca93d5b7" (UID: "0182d27a-0a19-4496-a514-75b6ca93d5b7"). InnerVolumeSpecName "kube-api-access-bb5qk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:19:54 crc kubenswrapper[4677]: I1007 13:19:54.685010 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0182d27a-0a19-4496-a514-75b6ca93d5b7-util" (OuterVolumeSpecName: "util") pod "0182d27a-0a19-4496-a514-75b6ca93d5b7" (UID: "0182d27a-0a19-4496-a514-75b6ca93d5b7"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:19:54 crc kubenswrapper[4677]: I1007 13:19:54.773248 4677 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0182d27a-0a19-4496-a514-75b6ca93d5b7-util\") on node \"crc\" DevicePath \"\"" Oct 07 13:19:54 crc kubenswrapper[4677]: I1007 13:19:54.773296 4677 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0182d27a-0a19-4496-a514-75b6ca93d5b7-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 13:19:54 crc kubenswrapper[4677]: I1007 13:19:54.773316 4677 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bb5qk\" (UniqueName: \"kubernetes.io/projected/0182d27a-0a19-4496-a514-75b6ca93d5b7-kube-api-access-bb5qk\") on node \"crc\" DevicePath \"\"" Oct 07 13:19:55 crc kubenswrapper[4677]: I1007 13:19:55.155160 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/4cb402f945d54a80688ab4565a4e9d19d5d4eb730a5ce0fdf7f49eb313zx98r" event={"ID":"0182d27a-0a19-4496-a514-75b6ca93d5b7","Type":"ContainerDied","Data":"2f8baf2296125f49004bff193f58a945c91668721abe38fdcf71fe20d43f9792"} Oct 07 13:19:55 crc kubenswrapper[4677]: I1007 13:19:55.155206 4677 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2f8baf2296125f49004bff193f58a945c91668721abe38fdcf71fe20d43f9792" Oct 07 13:19:55 crc kubenswrapper[4677]: I1007 13:19:55.155286 4677 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/4cb402f945d54a80688ab4565a4e9d19d5d4eb730a5ce0fdf7f49eb313zx98r" Oct 07 13:20:00 crc kubenswrapper[4677]: I1007 13:20:00.803476 4677 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-586b5ff777-4p7n4"] Oct 07 13:20:00 crc kubenswrapper[4677]: E1007 13:20:00.804961 4677 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0182d27a-0a19-4496-a514-75b6ca93d5b7" containerName="extract" Oct 07 13:20:00 crc kubenswrapper[4677]: I1007 13:20:00.805116 4677 state_mem.go:107] "Deleted CPUSet assignment" podUID="0182d27a-0a19-4496-a514-75b6ca93d5b7" containerName="extract" Oct 07 13:20:00 crc kubenswrapper[4677]: E1007 13:20:00.805245 4677 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0182d27a-0a19-4496-a514-75b6ca93d5b7" containerName="pull" Oct 07 13:20:00 crc kubenswrapper[4677]: I1007 13:20:00.805355 4677 state_mem.go:107] "Deleted CPUSet assignment" podUID="0182d27a-0a19-4496-a514-75b6ca93d5b7" containerName="pull" Oct 07 13:20:00 crc kubenswrapper[4677]: E1007 13:20:00.805510 4677 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0182d27a-0a19-4496-a514-75b6ca93d5b7" containerName="util" Oct 07 13:20:00 crc kubenswrapper[4677]: I1007 13:20:00.805614 4677 state_mem.go:107] "Deleted CPUSet assignment" podUID="0182d27a-0a19-4496-a514-75b6ca93d5b7" containerName="util" Oct 07 13:20:00 crc kubenswrapper[4677]: I1007 13:20:00.805876 4677 memory_manager.go:354] "RemoveStaleState removing state" podUID="0182d27a-0a19-4496-a514-75b6ca93d5b7" containerName="extract" Oct 07 13:20:00 crc kubenswrapper[4677]: I1007 13:20:00.807027 4677 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-586b5ff777-4p7n4" Oct 07 13:20:00 crc kubenswrapper[4677]: I1007 13:20:00.809239 4677 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-service-cert" Oct 07 13:20:00 crc kubenswrapper[4677]: I1007 13:20:00.809390 4677 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-n4xnl" Oct 07 13:20:00 crc kubenswrapper[4677]: I1007 13:20:00.832019 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-586b5ff777-4p7n4"] Oct 07 13:20:00 crc kubenswrapper[4677]: I1007 13:20:00.857380 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/97f6efbc-5f7a-48b6-ae09-35afb8f8dd7e-webhook-cert\") pod \"infra-operator-controller-manager-586b5ff777-4p7n4\" (UID: \"97f6efbc-5f7a-48b6-ae09-35afb8f8dd7e\") " pod="openstack-operators/infra-operator-controller-manager-586b5ff777-4p7n4" Oct 07 13:20:00 crc kubenswrapper[4677]: I1007 13:20:00.857463 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/97f6efbc-5f7a-48b6-ae09-35afb8f8dd7e-apiservice-cert\") pod \"infra-operator-controller-manager-586b5ff777-4p7n4\" (UID: \"97f6efbc-5f7a-48b6-ae09-35afb8f8dd7e\") " pod="openstack-operators/infra-operator-controller-manager-586b5ff777-4p7n4" Oct 07 13:20:00 crc kubenswrapper[4677]: I1007 13:20:00.857500 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f4kxd\" (UniqueName: \"kubernetes.io/projected/97f6efbc-5f7a-48b6-ae09-35afb8f8dd7e-kube-api-access-f4kxd\") pod \"infra-operator-controller-manager-586b5ff777-4p7n4\" (UID: \"97f6efbc-5f7a-48b6-ae09-35afb8f8dd7e\") " pod="openstack-operators/infra-operator-controller-manager-586b5ff777-4p7n4" Oct 07 13:20:00 crc kubenswrapper[4677]: I1007 13:20:00.958745 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/97f6efbc-5f7a-48b6-ae09-35afb8f8dd7e-webhook-cert\") pod \"infra-operator-controller-manager-586b5ff777-4p7n4\" (UID: \"97f6efbc-5f7a-48b6-ae09-35afb8f8dd7e\") " pod="openstack-operators/infra-operator-controller-manager-586b5ff777-4p7n4" Oct 07 13:20:00 crc kubenswrapper[4677]: I1007 13:20:00.958803 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/97f6efbc-5f7a-48b6-ae09-35afb8f8dd7e-apiservice-cert\") pod \"infra-operator-controller-manager-586b5ff777-4p7n4\" (UID: \"97f6efbc-5f7a-48b6-ae09-35afb8f8dd7e\") " pod="openstack-operators/infra-operator-controller-manager-586b5ff777-4p7n4" Oct 07 13:20:00 crc kubenswrapper[4677]: I1007 13:20:00.958837 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f4kxd\" (UniqueName: \"kubernetes.io/projected/97f6efbc-5f7a-48b6-ae09-35afb8f8dd7e-kube-api-access-f4kxd\") pod \"infra-operator-controller-manager-586b5ff777-4p7n4\" (UID: \"97f6efbc-5f7a-48b6-ae09-35afb8f8dd7e\") " pod="openstack-operators/infra-operator-controller-manager-586b5ff777-4p7n4" Oct 07 13:20:00 crc kubenswrapper[4677]: I1007 13:20:00.964973 4677 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/97f6efbc-5f7a-48b6-ae09-35afb8f8dd7e-webhook-cert\") pod \"infra-operator-controller-manager-586b5ff777-4p7n4\" (UID: \"97f6efbc-5f7a-48b6-ae09-35afb8f8dd7e\") " pod="openstack-operators/infra-operator-controller-manager-586b5ff777-4p7n4" Oct 07 13:20:00 crc kubenswrapper[4677]: I1007 13:20:00.967120 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/97f6efbc-5f7a-48b6-ae09-35afb8f8dd7e-apiservice-cert\") pod \"infra-operator-controller-manager-586b5ff777-4p7n4\" (UID: \"97f6efbc-5f7a-48b6-ae09-35afb8f8dd7e\") " pod="openstack-operators/infra-operator-controller-manager-586b5ff777-4p7n4" Oct 07 13:20:00 crc kubenswrapper[4677]: I1007 13:20:00.978727 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f4kxd\" (UniqueName: \"kubernetes.io/projected/97f6efbc-5f7a-48b6-ae09-35afb8f8dd7e-kube-api-access-f4kxd\") pod \"infra-operator-controller-manager-586b5ff777-4p7n4\" (UID: \"97f6efbc-5f7a-48b6-ae09-35afb8f8dd7e\") " pod="openstack-operators/infra-operator-controller-manager-586b5ff777-4p7n4" Oct 07 13:20:01 crc kubenswrapper[4677]: I1007 13:20:01.132046 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-586b5ff777-4p7n4" Oct 07 13:20:01 crc kubenswrapper[4677]: I1007 13:20:01.576761 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-586b5ff777-4p7n4"] Oct 07 13:20:01 crc kubenswrapper[4677]: W1007 13:20:01.589641 4677 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod97f6efbc_5f7a_48b6_ae09_35afb8f8dd7e.slice/crio-3ef030ba7124d69940a36cebcd43162e1cc6736ed714d1c9e2311a8803556b32 WatchSource:0}: Error finding container 3ef030ba7124d69940a36cebcd43162e1cc6736ed714d1c9e2311a8803556b32: Status 404 returned error can't find the container with id 3ef030ba7124d69940a36cebcd43162e1cc6736ed714d1c9e2311a8803556b32 Oct 07 13:20:02 crc kubenswrapper[4677]: I1007 13:20:02.218127 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-586b5ff777-4p7n4" event={"ID":"97f6efbc-5f7a-48b6-ae09-35afb8f8dd7e","Type":"ContainerStarted","Data":"3ef030ba7124d69940a36cebcd43162e1cc6736ed714d1c9e2311a8803556b32"} Oct 07 13:20:04 crc kubenswrapper[4677]: I1007 13:20:04.241627 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-586b5ff777-4p7n4" event={"ID":"97f6efbc-5f7a-48b6-ae09-35afb8f8dd7e","Type":"ContainerStarted","Data":"110b2f63d547d3dd36d4010a8aa1c8a98f9faa65bddee7244f9e2d754401591a"} Oct 07 13:20:04 crc kubenswrapper[4677]: I1007 13:20:04.242181 4677 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-586b5ff777-4p7n4" Oct 07 13:20:04 crc kubenswrapper[4677]: I1007 13:20:04.242199 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-586b5ff777-4p7n4" event={"ID":"97f6efbc-5f7a-48b6-ae09-35afb8f8dd7e","Type":"ContainerStarted","Data":"9701bf2bb18cb5a0dd935c25551e909002c36fdfad502d82e9612b582647ccb1"} Oct 07 13:20:04 crc kubenswrapper[4677]: I1007 13:20:04.275644 4677 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/infra-operator-controller-manager-586b5ff777-4p7n4" podStartSLOduration=2.42170329 podStartE2EDuration="4.275613657s" podCreationTimestamp="2025-10-07 13:20:00 +0000 UTC" firstStartedPulling="2025-10-07 13:20:01.592512507 +0000 UTC m=+773.078221622" lastFinishedPulling="2025-10-07 13:20:03.446422874 +0000 UTC m=+774.932131989" observedRunningTime="2025-10-07 13:20:04.266692711 +0000 UTC m=+775.752401866" watchObservedRunningTime="2025-10-07 13:20:04.275613657 +0000 UTC m=+775.761322812" Oct 07 13:20:07 crc kubenswrapper[4677]: I1007 13:20:07.798751 4677 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/openstack-galera-0"] Oct 07 13:20:07 crc kubenswrapper[4677]: I1007 13:20:07.800025 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/openstack-galera-0" Oct 07 13:20:07 crc kubenswrapper[4677]: I1007 13:20:07.805835 4677 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/openstack-galera-2"] Oct 07 13:20:07 crc kubenswrapper[4677]: I1007 13:20:07.808026 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/openstack-galera-2" Oct 07 13:20:07 crc kubenswrapper[4677]: I1007 13:20:07.811805 4677 reflector.go:368] Caches populated for *v1.ConfigMap from object-"keystone-kuttl-tests"/"openshift-service-ca.crt" Oct 07 13:20:07 crc kubenswrapper[4677]: I1007 13:20:07.812143 4677 reflector.go:368] Caches populated for *v1.ConfigMap from object-"keystone-kuttl-tests"/"openstack-scripts" Oct 07 13:20:07 crc kubenswrapper[4677]: I1007 13:20:07.812295 4677 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/openstack-galera-1"] Oct 07 13:20:07 crc kubenswrapper[4677]: I1007 13:20:07.813377 4677 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/openstack-galera-1" Oct 07 13:20:07 crc kubenswrapper[4677]: I1007 13:20:07.814187 4677 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"osp-secret" Oct 07 13:20:07 crc kubenswrapper[4677]: I1007 13:20:07.814270 4677 reflector.go:368] Caches populated for *v1.ConfigMap from object-"keystone-kuttl-tests"/"openstack-config-data" Oct 07 13:20:07 crc kubenswrapper[4677]: I1007 13:20:07.814320 4677 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"galera-openstack-dockercfg-6nnh2" Oct 07 13:20:07 crc kubenswrapper[4677]: I1007 13:20:07.814493 4677 reflector.go:368] Caches populated for *v1.ConfigMap from object-"keystone-kuttl-tests"/"kube-root-ca.crt" Oct 07 13:20:07 crc kubenswrapper[4677]: I1007 13:20:07.867415 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/openstack-galera-0"] Oct 07 13:20:07 crc kubenswrapper[4677]: I1007 13:20:07.874980 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/openstack-galera-1"] Oct 07 13:20:07 crc kubenswrapper[4677]: I1007 13:20:07.879828 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/openstack-galera-2"] Oct 07 13:20:07 crc kubenswrapper[4677]: I1007 13:20:07.950592 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/3b168b6a-df61-44ed-8a09-ed30d3ecc2ea-secrets\") pod \"openstack-galera-2\" (UID: \"3b168b6a-df61-44ed-8a09-ed30d3ecc2ea\") " pod="keystone-kuttl-tests/openstack-galera-2" Oct 07 13:20:07 crc kubenswrapper[4677]: I1007 13:20:07.950639 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/3b168b6a-df61-44ed-8a09-ed30d3ecc2ea-config-data-generated\") pod \"openstack-galera-2\" (UID: \"3b168b6a-df61-44ed-8a09-ed30d3ecc2ea\") " pod="keystone-kuttl-tests/openstack-galera-2" Oct 07 13:20:07 crc kubenswrapper[4677]: I1007 13:20:07.950663 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/8ec4d41c-e5c7-49c2-be5b-2b228d1ddcb8-config-data-generated\") pod \"openstack-galera-1\" (UID: \"8ec4d41c-e5c7-49c2-be5b-2b228d1ddcb8\") " pod="keystone-kuttl-tests/openstack-galera-1" Oct 07 13:20:07 crc kubenswrapper[4677]: I1007 13:20:07.950688 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-galera-0\" (UID: \"bf5b66ae-3d1d-4299-bfe7-d3f3eb705177\") " pod="keystone-kuttl-tests/openstack-galera-0" Oct 07 13:20:07 crc kubenswrapper[4677]: I1007 13:20:07.950717 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bf5b66ae-3d1d-4299-bfe7-d3f3eb705177-operator-scripts\") pod \"openstack-galera-0\" (UID: \"bf5b66ae-3d1d-4299-bfe7-d3f3eb705177\") " pod="keystone-kuttl-tests/openstack-galera-0" Oct 07 13:20:07 crc kubenswrapper[4677]: I1007 13:20:07.950737 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/bf5b66ae-3d1d-4299-bfe7-d3f3eb705177-config-data-generated\") pod 
\"openstack-galera-0\" (UID: \"bf5b66ae-3d1d-4299-bfe7-d3f3eb705177\") " pod="keystone-kuttl-tests/openstack-galera-0" Oct 07 13:20:07 crc kubenswrapper[4677]: I1007 13:20:07.950752 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/3b168b6a-df61-44ed-8a09-ed30d3ecc2ea-config-data-default\") pod \"openstack-galera-2\" (UID: \"3b168b6a-df61-44ed-8a09-ed30d3ecc2ea\") " pod="keystone-kuttl-tests/openstack-galera-2" Oct 07 13:20:07 crc kubenswrapper[4677]: I1007 13:20:07.950767 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/3b168b6a-df61-44ed-8a09-ed30d3ecc2ea-kolla-config\") pod \"openstack-galera-2\" (UID: \"3b168b6a-df61-44ed-8a09-ed30d3ecc2ea\") " pod="keystone-kuttl-tests/openstack-galera-2" Oct 07 13:20:07 crc kubenswrapper[4677]: I1007 13:20:07.950786 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/bf5b66ae-3d1d-4299-bfe7-d3f3eb705177-secrets\") pod \"openstack-galera-0\" (UID: \"bf5b66ae-3d1d-4299-bfe7-d3f3eb705177\") " pod="keystone-kuttl-tests/openstack-galera-0" Oct 07 13:20:07 crc kubenswrapper[4677]: I1007 13:20:07.950810 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8ec4d41c-e5c7-49c2-be5b-2b228d1ddcb8-operator-scripts\") pod \"openstack-galera-1\" (UID: \"8ec4d41c-e5c7-49c2-be5b-2b228d1ddcb8\") " pod="keystone-kuttl-tests/openstack-galera-1" Oct 07 13:20:07 crc kubenswrapper[4677]: I1007 13:20:07.950827 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/bf5b66ae-3d1d-4299-bfe7-d3f3eb705177-kolla-config\") pod \"openstack-galera-0\" (UID: \"bf5b66ae-3d1d-4299-bfe7-d3f3eb705177\") " pod="keystone-kuttl-tests/openstack-galera-0" Oct 07 13:20:07 crc kubenswrapper[4677]: I1007 13:20:07.950842 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/8ec4d41c-e5c7-49c2-be5b-2b228d1ddcb8-config-data-default\") pod \"openstack-galera-1\" (UID: \"8ec4d41c-e5c7-49c2-be5b-2b228d1ddcb8\") " pod="keystone-kuttl-tests/openstack-galera-1" Oct 07 13:20:07 crc kubenswrapper[4677]: I1007 13:20:07.950860 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-galera-2\" (UID: \"3b168b6a-df61-44ed-8a09-ed30d3ecc2ea\") " pod="keystone-kuttl-tests/openstack-galera-2" Oct 07 13:20:07 crc kubenswrapper[4677]: I1007 13:20:07.950876 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/8ec4d41c-e5c7-49c2-be5b-2b228d1ddcb8-secrets\") pod \"openstack-galera-1\" (UID: \"8ec4d41c-e5c7-49c2-be5b-2b228d1ddcb8\") " pod="keystone-kuttl-tests/openstack-galera-1" Oct 07 13:20:07 crc kubenswrapper[4677]: I1007 13:20:07.950905 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3b168b6a-df61-44ed-8a09-ed30d3ecc2ea-operator-scripts\") pod 
\"openstack-galera-2\" (UID: \"3b168b6a-df61-44ed-8a09-ed30d3ecc2ea\") " pod="keystone-kuttl-tests/openstack-galera-2" Oct 07 13:20:07 crc kubenswrapper[4677]: I1007 13:20:07.950920 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-galera-1\" (UID: \"8ec4d41c-e5c7-49c2-be5b-2b228d1ddcb8\") " pod="keystone-kuttl-tests/openstack-galera-1" Oct 07 13:20:07 crc kubenswrapper[4677]: I1007 13:20:07.950938 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zbvqj\" (UniqueName: \"kubernetes.io/projected/bf5b66ae-3d1d-4299-bfe7-d3f3eb705177-kube-api-access-zbvqj\") pod \"openstack-galera-0\" (UID: \"bf5b66ae-3d1d-4299-bfe7-d3f3eb705177\") " pod="keystone-kuttl-tests/openstack-galera-0" Oct 07 13:20:07 crc kubenswrapper[4677]: I1007 13:20:07.950959 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7rt5v\" (UniqueName: \"kubernetes.io/projected/3b168b6a-df61-44ed-8a09-ed30d3ecc2ea-kube-api-access-7rt5v\") pod \"openstack-galera-2\" (UID: \"3b168b6a-df61-44ed-8a09-ed30d3ecc2ea\") " pod="keystone-kuttl-tests/openstack-galera-2" Oct 07 13:20:07 crc kubenswrapper[4677]: I1007 13:20:07.950979 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/bf5b66ae-3d1d-4299-bfe7-d3f3eb705177-config-data-default\") pod \"openstack-galera-0\" (UID: \"bf5b66ae-3d1d-4299-bfe7-d3f3eb705177\") " pod="keystone-kuttl-tests/openstack-galera-0" Oct 07 13:20:07 crc kubenswrapper[4677]: I1007 13:20:07.950993 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lsqp4\" (UniqueName: \"kubernetes.io/projected/8ec4d41c-e5c7-49c2-be5b-2b228d1ddcb8-kube-api-access-lsqp4\") pod \"openstack-galera-1\" (UID: \"8ec4d41c-e5c7-49c2-be5b-2b228d1ddcb8\") " pod="keystone-kuttl-tests/openstack-galera-1" Oct 07 13:20:07 crc kubenswrapper[4677]: I1007 13:20:07.951008 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/8ec4d41c-e5c7-49c2-be5b-2b228d1ddcb8-kolla-config\") pod \"openstack-galera-1\" (UID: \"8ec4d41c-e5c7-49c2-be5b-2b228d1ddcb8\") " pod="keystone-kuttl-tests/openstack-galera-1" Oct 07 13:20:08 crc kubenswrapper[4677]: I1007 13:20:08.051867 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/3b168b6a-df61-44ed-8a09-ed30d3ecc2ea-secrets\") pod \"openstack-galera-2\" (UID: \"3b168b6a-df61-44ed-8a09-ed30d3ecc2ea\") " pod="keystone-kuttl-tests/openstack-galera-2" Oct 07 13:20:08 crc kubenswrapper[4677]: I1007 13:20:08.051948 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/3b168b6a-df61-44ed-8a09-ed30d3ecc2ea-config-data-generated\") pod \"openstack-galera-2\" (UID: \"3b168b6a-df61-44ed-8a09-ed30d3ecc2ea\") " pod="keystone-kuttl-tests/openstack-galera-2" Oct 07 13:20:08 crc kubenswrapper[4677]: I1007 13:20:08.051982 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: 
\"kubernetes.io/empty-dir/8ec4d41c-e5c7-49c2-be5b-2b228d1ddcb8-config-data-generated\") pod \"openstack-galera-1\" (UID: \"8ec4d41c-e5c7-49c2-be5b-2b228d1ddcb8\") " pod="keystone-kuttl-tests/openstack-galera-1" Oct 07 13:20:08 crc kubenswrapper[4677]: I1007 13:20:08.052008 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-galera-0\" (UID: \"bf5b66ae-3d1d-4299-bfe7-d3f3eb705177\") " pod="keystone-kuttl-tests/openstack-galera-0" Oct 07 13:20:08 crc kubenswrapper[4677]: I1007 13:20:08.052033 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bf5b66ae-3d1d-4299-bfe7-d3f3eb705177-operator-scripts\") pod \"openstack-galera-0\" (UID: \"bf5b66ae-3d1d-4299-bfe7-d3f3eb705177\") " pod="keystone-kuttl-tests/openstack-galera-0" Oct 07 13:20:08 crc kubenswrapper[4677]: I1007 13:20:08.052057 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/bf5b66ae-3d1d-4299-bfe7-d3f3eb705177-config-data-generated\") pod \"openstack-galera-0\" (UID: \"bf5b66ae-3d1d-4299-bfe7-d3f3eb705177\") " pod="keystone-kuttl-tests/openstack-galera-0" Oct 07 13:20:08 crc kubenswrapper[4677]: I1007 13:20:08.052077 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/3b168b6a-df61-44ed-8a09-ed30d3ecc2ea-config-data-default\") pod \"openstack-galera-2\" (UID: \"3b168b6a-df61-44ed-8a09-ed30d3ecc2ea\") " pod="keystone-kuttl-tests/openstack-galera-2" Oct 07 13:20:08 crc kubenswrapper[4677]: I1007 13:20:08.052101 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/3b168b6a-df61-44ed-8a09-ed30d3ecc2ea-kolla-config\") pod \"openstack-galera-2\" (UID: \"3b168b6a-df61-44ed-8a09-ed30d3ecc2ea\") " pod="keystone-kuttl-tests/openstack-galera-2" Oct 07 13:20:08 crc kubenswrapper[4677]: I1007 13:20:08.052126 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/bf5b66ae-3d1d-4299-bfe7-d3f3eb705177-secrets\") pod \"openstack-galera-0\" (UID: \"bf5b66ae-3d1d-4299-bfe7-d3f3eb705177\") " pod="keystone-kuttl-tests/openstack-galera-0" Oct 07 13:20:08 crc kubenswrapper[4677]: I1007 13:20:08.052158 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8ec4d41c-e5c7-49c2-be5b-2b228d1ddcb8-operator-scripts\") pod \"openstack-galera-1\" (UID: \"8ec4d41c-e5c7-49c2-be5b-2b228d1ddcb8\") " pod="keystone-kuttl-tests/openstack-galera-1" Oct 07 13:20:08 crc kubenswrapper[4677]: I1007 13:20:08.052181 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/bf5b66ae-3d1d-4299-bfe7-d3f3eb705177-kolla-config\") pod \"openstack-galera-0\" (UID: \"bf5b66ae-3d1d-4299-bfe7-d3f3eb705177\") " pod="keystone-kuttl-tests/openstack-galera-0" Oct 07 13:20:08 crc kubenswrapper[4677]: I1007 13:20:08.052205 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/8ec4d41c-e5c7-49c2-be5b-2b228d1ddcb8-config-data-default\") pod \"openstack-galera-1\" (UID: 
\"8ec4d41c-e5c7-49c2-be5b-2b228d1ddcb8\") " pod="keystone-kuttl-tests/openstack-galera-1" Oct 07 13:20:08 crc kubenswrapper[4677]: I1007 13:20:08.052232 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-galera-2\" (UID: \"3b168b6a-df61-44ed-8a09-ed30d3ecc2ea\") " pod="keystone-kuttl-tests/openstack-galera-2" Oct 07 13:20:08 crc kubenswrapper[4677]: I1007 13:20:08.052256 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/8ec4d41c-e5c7-49c2-be5b-2b228d1ddcb8-secrets\") pod \"openstack-galera-1\" (UID: \"8ec4d41c-e5c7-49c2-be5b-2b228d1ddcb8\") " pod="keystone-kuttl-tests/openstack-galera-1" Oct 07 13:20:08 crc kubenswrapper[4677]: I1007 13:20:08.052295 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3b168b6a-df61-44ed-8a09-ed30d3ecc2ea-operator-scripts\") pod \"openstack-galera-2\" (UID: \"3b168b6a-df61-44ed-8a09-ed30d3ecc2ea\") " pod="keystone-kuttl-tests/openstack-galera-2" Oct 07 13:20:08 crc kubenswrapper[4677]: I1007 13:20:08.052318 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-galera-1\" (UID: \"8ec4d41c-e5c7-49c2-be5b-2b228d1ddcb8\") " pod="keystone-kuttl-tests/openstack-galera-1" Oct 07 13:20:08 crc kubenswrapper[4677]: I1007 13:20:08.052339 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zbvqj\" (UniqueName: \"kubernetes.io/projected/bf5b66ae-3d1d-4299-bfe7-d3f3eb705177-kube-api-access-zbvqj\") pod \"openstack-galera-0\" (UID: \"bf5b66ae-3d1d-4299-bfe7-d3f3eb705177\") " pod="keystone-kuttl-tests/openstack-galera-0" Oct 07 13:20:08 crc kubenswrapper[4677]: I1007 13:20:08.052373 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7rt5v\" (UniqueName: \"kubernetes.io/projected/3b168b6a-df61-44ed-8a09-ed30d3ecc2ea-kube-api-access-7rt5v\") pod \"openstack-galera-2\" (UID: \"3b168b6a-df61-44ed-8a09-ed30d3ecc2ea\") " pod="keystone-kuttl-tests/openstack-galera-2" Oct 07 13:20:08 crc kubenswrapper[4677]: I1007 13:20:08.052398 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/bf5b66ae-3d1d-4299-bfe7-d3f3eb705177-config-data-default\") pod \"openstack-galera-0\" (UID: \"bf5b66ae-3d1d-4299-bfe7-d3f3eb705177\") " pod="keystone-kuttl-tests/openstack-galera-0" Oct 07 13:20:08 crc kubenswrapper[4677]: I1007 13:20:08.052418 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lsqp4\" (UniqueName: \"kubernetes.io/projected/8ec4d41c-e5c7-49c2-be5b-2b228d1ddcb8-kube-api-access-lsqp4\") pod \"openstack-galera-1\" (UID: \"8ec4d41c-e5c7-49c2-be5b-2b228d1ddcb8\") " pod="keystone-kuttl-tests/openstack-galera-1" Oct 07 13:20:08 crc kubenswrapper[4677]: I1007 13:20:08.052457 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/8ec4d41c-e5c7-49c2-be5b-2b228d1ddcb8-kolla-config\") pod \"openstack-galera-1\" (UID: \"8ec4d41c-e5c7-49c2-be5b-2b228d1ddcb8\") " pod="keystone-kuttl-tests/openstack-galera-1" Oct 07 13:20:08 crc kubenswrapper[4677]: I1007 
13:20:08.053399 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/8ec4d41c-e5c7-49c2-be5b-2b228d1ddcb8-kolla-config\") pod \"openstack-galera-1\" (UID: \"8ec4d41c-e5c7-49c2-be5b-2b228d1ddcb8\") " pod="keystone-kuttl-tests/openstack-galera-1" Oct 07 13:20:08 crc kubenswrapper[4677]: I1007 13:20:08.053979 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/bf5b66ae-3d1d-4299-bfe7-d3f3eb705177-kolla-config\") pod \"openstack-galera-0\" (UID: \"bf5b66ae-3d1d-4299-bfe7-d3f3eb705177\") " pod="keystone-kuttl-tests/openstack-galera-0" Oct 07 13:20:08 crc kubenswrapper[4677]: I1007 13:20:08.054102 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/8ec4d41c-e5c7-49c2-be5b-2b228d1ddcb8-config-data-default\") pod \"openstack-galera-1\" (UID: \"8ec4d41c-e5c7-49c2-be5b-2b228d1ddcb8\") " pod="keystone-kuttl-tests/openstack-galera-1" Oct 07 13:20:08 crc kubenswrapper[4677]: I1007 13:20:08.054329 4677 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-galera-2\" (UID: \"3b168b6a-df61-44ed-8a09-ed30d3ecc2ea\") device mount path \"/mnt/openstack/pv07\"" pod="keystone-kuttl-tests/openstack-galera-2" Oct 07 13:20:08 crc kubenswrapper[4677]: I1007 13:20:08.054346 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/3b168b6a-df61-44ed-8a09-ed30d3ecc2ea-config-data-generated\") pod \"openstack-galera-2\" (UID: \"3b168b6a-df61-44ed-8a09-ed30d3ecc2ea\") " pod="keystone-kuttl-tests/openstack-galera-2" Oct 07 13:20:08 crc kubenswrapper[4677]: I1007 13:20:08.054730 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/8ec4d41c-e5c7-49c2-be5b-2b228d1ddcb8-config-data-generated\") pod \"openstack-galera-1\" (UID: \"8ec4d41c-e5c7-49c2-be5b-2b228d1ddcb8\") " pod="keystone-kuttl-tests/openstack-galera-1" Oct 07 13:20:08 crc kubenswrapper[4677]: I1007 13:20:08.055622 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/3b168b6a-df61-44ed-8a09-ed30d3ecc2ea-kolla-config\") pod \"openstack-galera-2\" (UID: \"3b168b6a-df61-44ed-8a09-ed30d3ecc2ea\") " pod="keystone-kuttl-tests/openstack-galera-2" Oct 07 13:20:08 crc kubenswrapper[4677]: I1007 13:20:08.055676 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/3b168b6a-df61-44ed-8a09-ed30d3ecc2ea-config-data-default\") pod \"openstack-galera-2\" (UID: \"3b168b6a-df61-44ed-8a09-ed30d3ecc2ea\") " pod="keystone-kuttl-tests/openstack-galera-2" Oct 07 13:20:08 crc kubenswrapper[4677]: I1007 13:20:08.056183 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bf5b66ae-3d1d-4299-bfe7-d3f3eb705177-operator-scripts\") pod \"openstack-galera-0\" (UID: \"bf5b66ae-3d1d-4299-bfe7-d3f3eb705177\") " pod="keystone-kuttl-tests/openstack-galera-0" Oct 07 13:20:08 crc kubenswrapper[4677]: I1007 13:20:08.056186 4677 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-galera-1\" (UID: \"8ec4d41c-e5c7-49c2-be5b-2b228d1ddcb8\") device mount path \"/mnt/openstack/pv03\"" pod="keystone-kuttl-tests/openstack-galera-1" Oct 07 13:20:08 crc kubenswrapper[4677]: I1007 13:20:08.056479 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3b168b6a-df61-44ed-8a09-ed30d3ecc2ea-operator-scripts\") pod \"openstack-galera-2\" (UID: \"3b168b6a-df61-44ed-8a09-ed30d3ecc2ea\") " pod="keystone-kuttl-tests/openstack-galera-2" Oct 07 13:20:08 crc kubenswrapper[4677]: I1007 13:20:08.056514 4677 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-galera-0\" (UID: \"bf5b66ae-3d1d-4299-bfe7-d3f3eb705177\") device mount path \"/mnt/openstack/pv04\"" pod="keystone-kuttl-tests/openstack-galera-0" Oct 07 13:20:08 crc kubenswrapper[4677]: I1007 13:20:08.056764 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/bf5b66ae-3d1d-4299-bfe7-d3f3eb705177-config-data-generated\") pod \"openstack-galera-0\" (UID: \"bf5b66ae-3d1d-4299-bfe7-d3f3eb705177\") " pod="keystone-kuttl-tests/openstack-galera-0" Oct 07 13:20:08 crc kubenswrapper[4677]: I1007 13:20:08.057004 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/bf5b66ae-3d1d-4299-bfe7-d3f3eb705177-config-data-default\") pod \"openstack-galera-0\" (UID: \"bf5b66ae-3d1d-4299-bfe7-d3f3eb705177\") " pod="keystone-kuttl-tests/openstack-galera-0" Oct 07 13:20:08 crc kubenswrapper[4677]: I1007 13:20:08.057889 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8ec4d41c-e5c7-49c2-be5b-2b228d1ddcb8-operator-scripts\") pod \"openstack-galera-1\" (UID: \"8ec4d41c-e5c7-49c2-be5b-2b228d1ddcb8\") " pod="keystone-kuttl-tests/openstack-galera-1" Oct 07 13:20:08 crc kubenswrapper[4677]: I1007 13:20:08.060402 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/8ec4d41c-e5c7-49c2-be5b-2b228d1ddcb8-secrets\") pod \"openstack-galera-1\" (UID: \"8ec4d41c-e5c7-49c2-be5b-2b228d1ddcb8\") " pod="keystone-kuttl-tests/openstack-galera-1" Oct 07 13:20:08 crc kubenswrapper[4677]: I1007 13:20:08.065085 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/3b168b6a-df61-44ed-8a09-ed30d3ecc2ea-secrets\") pod \"openstack-galera-2\" (UID: \"3b168b6a-df61-44ed-8a09-ed30d3ecc2ea\") " pod="keystone-kuttl-tests/openstack-galera-2" Oct 07 13:20:08 crc kubenswrapper[4677]: I1007 13:20:08.067813 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/bf5b66ae-3d1d-4299-bfe7-d3f3eb705177-secrets\") pod \"openstack-galera-0\" (UID: \"bf5b66ae-3d1d-4299-bfe7-d3f3eb705177\") " pod="keystone-kuttl-tests/openstack-galera-0" Oct 07 13:20:08 crc kubenswrapper[4677]: I1007 13:20:08.079717 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7rt5v\" (UniqueName: \"kubernetes.io/projected/3b168b6a-df61-44ed-8a09-ed30d3ecc2ea-kube-api-access-7rt5v\") pod \"openstack-galera-2\" (UID: \"3b168b6a-df61-44ed-8a09-ed30d3ecc2ea\") " 
pod="keystone-kuttl-tests/openstack-galera-2" Oct 07 13:20:08 crc kubenswrapper[4677]: I1007 13:20:08.080320 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lsqp4\" (UniqueName: \"kubernetes.io/projected/8ec4d41c-e5c7-49c2-be5b-2b228d1ddcb8-kube-api-access-lsqp4\") pod \"openstack-galera-1\" (UID: \"8ec4d41c-e5c7-49c2-be5b-2b228d1ddcb8\") " pod="keystone-kuttl-tests/openstack-galera-1" Oct 07 13:20:08 crc kubenswrapper[4677]: I1007 13:20:08.081067 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zbvqj\" (UniqueName: \"kubernetes.io/projected/bf5b66ae-3d1d-4299-bfe7-d3f3eb705177-kube-api-access-zbvqj\") pod \"openstack-galera-0\" (UID: \"bf5b66ae-3d1d-4299-bfe7-d3f3eb705177\") " pod="keystone-kuttl-tests/openstack-galera-0" Oct 07 13:20:08 crc kubenswrapper[4677]: I1007 13:20:08.081245 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-galera-2\" (UID: \"3b168b6a-df61-44ed-8a09-ed30d3ecc2ea\") " pod="keystone-kuttl-tests/openstack-galera-2" Oct 07 13:20:08 crc kubenswrapper[4677]: I1007 13:20:08.085973 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-galera-1\" (UID: \"8ec4d41c-e5c7-49c2-be5b-2b228d1ddcb8\") " pod="keystone-kuttl-tests/openstack-galera-1" Oct 07 13:20:08 crc kubenswrapper[4677]: I1007 13:20:08.089930 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-galera-0\" (UID: \"bf5b66ae-3d1d-4299-bfe7-d3f3eb705177\") " pod="keystone-kuttl-tests/openstack-galera-0" Oct 07 13:20:08 crc kubenswrapper[4677]: I1007 13:20:08.120903 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/openstack-galera-0" Oct 07 13:20:08 crc kubenswrapper[4677]: I1007 13:20:08.173665 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/openstack-galera-2" Oct 07 13:20:08 crc kubenswrapper[4677]: I1007 13:20:08.185056 4677 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/openstack-galera-1" Oct 07 13:20:08 crc kubenswrapper[4677]: I1007 13:20:08.415953 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/openstack-galera-1"] Oct 07 13:20:08 crc kubenswrapper[4677]: W1007 13:20:08.421720 4677 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8ec4d41c_e5c7_49c2_be5b_2b228d1ddcb8.slice/crio-eb154f906d6745cfc3c33141794e849a19ab7ae624714e5e93fe2191a9dc5e0e WatchSource:0}: Error finding container eb154f906d6745cfc3c33141794e849a19ab7ae624714e5e93fe2191a9dc5e0e: Status 404 returned error can't find the container with id eb154f906d6745cfc3c33141794e849a19ab7ae624714e5e93fe2191a9dc5e0e Oct 07 13:20:08 crc kubenswrapper[4677]: I1007 13:20:08.499905 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/openstack-galera-0"] Oct 07 13:20:08 crc kubenswrapper[4677]: W1007 13:20:08.506326 4677 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbf5b66ae_3d1d_4299_bfe7_d3f3eb705177.slice/crio-b5bb87dc83ab68feab18d3b344507b1f98c1608cb2e1c961e06ad1a1a3d42bae WatchSource:0}: Error finding container b5bb87dc83ab68feab18d3b344507b1f98c1608cb2e1c961e06ad1a1a3d42bae: Status 404 returned error can't find the container with id b5bb87dc83ab68feab18d3b344507b1f98c1608cb2e1c961e06ad1a1a3d42bae Oct 07 13:20:08 crc kubenswrapper[4677]: W1007 13:20:08.677027 4677 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3b168b6a_df61_44ed_8a09_ed30d3ecc2ea.slice/crio-acb4fb6a91db044ecddfe1cc74d6d29398ed860806fc3c2b7742bc4ee71d5da7 WatchSource:0}: Error finding container acb4fb6a91db044ecddfe1cc74d6d29398ed860806fc3c2b7742bc4ee71d5da7: Status 404 returned error can't find the container with id acb4fb6a91db044ecddfe1cc74d6d29398ed860806fc3c2b7742bc4ee71d5da7 Oct 07 13:20:08 crc kubenswrapper[4677]: I1007 13:20:08.677668 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/openstack-galera-2"] Oct 07 13:20:09 crc kubenswrapper[4677]: I1007 13:20:09.271759 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/openstack-galera-0" event={"ID":"bf5b66ae-3d1d-4299-bfe7-d3f3eb705177","Type":"ContainerStarted","Data":"b5bb87dc83ab68feab18d3b344507b1f98c1608cb2e1c961e06ad1a1a3d42bae"} Oct 07 13:20:09 crc kubenswrapper[4677]: I1007 13:20:09.272728 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/openstack-galera-2" event={"ID":"3b168b6a-df61-44ed-8a09-ed30d3ecc2ea","Type":"ContainerStarted","Data":"acb4fb6a91db044ecddfe1cc74d6d29398ed860806fc3c2b7742bc4ee71d5da7"} Oct 07 13:20:09 crc kubenswrapper[4677]: I1007 13:20:09.273830 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/openstack-galera-1" event={"ID":"8ec4d41c-e5c7-49c2-be5b-2b228d1ddcb8","Type":"ContainerStarted","Data":"eb154f906d6745cfc3c33141794e849a19ab7ae624714e5e93fe2191a9dc5e0e"} Oct 07 13:20:10 crc kubenswrapper[4677]: I1007 13:20:10.917592 4677 patch_prober.go:28] interesting pod/machine-config-daemon-r7cnz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 13:20:10 crc kubenswrapper[4677]: I1007 13:20:10.917951 
4677 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-r7cnz" podUID="7879fa59-a7cb-4d29-ba3a-c91f43bfcba6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 13:20:11 crc kubenswrapper[4677]: I1007 13:20:11.136870 4677 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-586b5ff777-4p7n4" Oct 07 13:20:18 crc kubenswrapper[4677]: I1007 13:20:18.363892 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/openstack-galera-1" event={"ID":"8ec4d41c-e5c7-49c2-be5b-2b228d1ddcb8","Type":"ContainerStarted","Data":"9d3359c7c136a9e1ea3a4a41290e3897c160364c2dc05f4779b52a6e74a12246"} Oct 07 13:20:18 crc kubenswrapper[4677]: I1007 13:20:18.366554 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/openstack-galera-0" event={"ID":"bf5b66ae-3d1d-4299-bfe7-d3f3eb705177","Type":"ContainerStarted","Data":"fb181f0f4644fb4c7ac9cfd18869ed5b8489d38d377c7d7ab7d3d9535c1db160"} Oct 07 13:20:18 crc kubenswrapper[4677]: I1007 13:20:18.367980 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/openstack-galera-2" event={"ID":"3b168b6a-df61-44ed-8a09-ed30d3ecc2ea","Type":"ContainerStarted","Data":"7618a0b64d72ab769046158fe526d0ea6873dff05516b608e89c7452bb5ecf96"} Oct 07 13:20:19 crc kubenswrapper[4677]: I1007 13:20:19.585690 4677 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-pt6z2"] Oct 07 13:20:19 crc kubenswrapper[4677]: I1007 13:20:19.586583 4677 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-index-pt6z2" Oct 07 13:20:19 crc kubenswrapper[4677]: I1007 13:20:19.589098 4677 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-index-dockercfg-2m97g" Oct 07 13:20:19 crc kubenswrapper[4677]: I1007 13:20:19.597599 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-pt6z2"] Oct 07 13:20:19 crc kubenswrapper[4677]: I1007 13:20:19.731617 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cthgg\" (UniqueName: \"kubernetes.io/projected/1a451150-fba9-465e-aab2-5d71965a674d-kube-api-access-cthgg\") pod \"rabbitmq-cluster-operator-index-pt6z2\" (UID: \"1a451150-fba9-465e-aab2-5d71965a674d\") " pod="openstack-operators/rabbitmq-cluster-operator-index-pt6z2" Oct 07 13:20:19 crc kubenswrapper[4677]: I1007 13:20:19.832759 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cthgg\" (UniqueName: \"kubernetes.io/projected/1a451150-fba9-465e-aab2-5d71965a674d-kube-api-access-cthgg\") pod \"rabbitmq-cluster-operator-index-pt6z2\" (UID: \"1a451150-fba9-465e-aab2-5d71965a674d\") " pod="openstack-operators/rabbitmq-cluster-operator-index-pt6z2" Oct 07 13:20:19 crc kubenswrapper[4677]: I1007 13:20:19.851470 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cthgg\" (UniqueName: \"kubernetes.io/projected/1a451150-fba9-465e-aab2-5d71965a674d-kube-api-access-cthgg\") pod \"rabbitmq-cluster-operator-index-pt6z2\" (UID: \"1a451150-fba9-465e-aab2-5d71965a674d\") " pod="openstack-operators/rabbitmq-cluster-operator-index-pt6z2" Oct 07 13:20:19 crc kubenswrapper[4677]: I1007 13:20:19.910360 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-index-pt6z2" Oct 07 13:20:20 crc kubenswrapper[4677]: I1007 13:20:20.389814 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-pt6z2"] Oct 07 13:20:21 crc kubenswrapper[4677]: I1007 13:20:21.395394 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-index-pt6z2" event={"ID":"1a451150-fba9-465e-aab2-5d71965a674d","Type":"ContainerStarted","Data":"1c39a5b4a94c9077a96705488901ace61536d08b049fad387d3ba903bfb53070"} Oct 07 13:20:23 crc kubenswrapper[4677]: I1007 13:20:23.773955 4677 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-pt6z2"] Oct 07 13:20:24 crc kubenswrapper[4677]: I1007 13:20:24.379300 4677 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-cxhtv"] Oct 07 13:20:24 crc kubenswrapper[4677]: I1007 13:20:24.380036 4677 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-index-cxhtv" Oct 07 13:20:24 crc kubenswrapper[4677]: I1007 13:20:24.393334 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-cxhtv"] Oct 07 13:20:24 crc kubenswrapper[4677]: I1007 13:20:24.505752 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rr2fn\" (UniqueName: \"kubernetes.io/projected/b24052e2-c147-452e-bc01-0970fe195485-kube-api-access-rr2fn\") pod \"rabbitmq-cluster-operator-index-cxhtv\" (UID: \"b24052e2-c147-452e-bc01-0970fe195485\") " pod="openstack-operators/rabbitmq-cluster-operator-index-cxhtv" Oct 07 13:20:24 crc kubenswrapper[4677]: I1007 13:20:24.607151 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rr2fn\" (UniqueName: \"kubernetes.io/projected/b24052e2-c147-452e-bc01-0970fe195485-kube-api-access-rr2fn\") pod \"rabbitmq-cluster-operator-index-cxhtv\" (UID: \"b24052e2-c147-452e-bc01-0970fe195485\") " pod="openstack-operators/rabbitmq-cluster-operator-index-cxhtv" Oct 07 13:20:24 crc kubenswrapper[4677]: I1007 13:20:24.642076 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rr2fn\" (UniqueName: \"kubernetes.io/projected/b24052e2-c147-452e-bc01-0970fe195485-kube-api-access-rr2fn\") pod \"rabbitmq-cluster-operator-index-cxhtv\" (UID: \"b24052e2-c147-452e-bc01-0970fe195485\") " pod="openstack-operators/rabbitmq-cluster-operator-index-cxhtv" Oct 07 13:20:24 crc kubenswrapper[4677]: I1007 13:20:24.694153 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-index-cxhtv" Oct 07 13:20:25 crc kubenswrapper[4677]: I1007 13:20:25.744070 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-cxhtv"] Oct 07 13:20:26 crc kubenswrapper[4677]: I1007 13:20:26.435187 4677 generic.go:334] "Generic (PLEG): container finished" podID="3b168b6a-df61-44ed-8a09-ed30d3ecc2ea" containerID="7618a0b64d72ab769046158fe526d0ea6873dff05516b608e89c7452bb5ecf96" exitCode=0 Oct 07 13:20:26 crc kubenswrapper[4677]: I1007 13:20:26.435308 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/openstack-galera-2" event={"ID":"3b168b6a-df61-44ed-8a09-ed30d3ecc2ea","Type":"ContainerDied","Data":"7618a0b64d72ab769046158fe526d0ea6873dff05516b608e89c7452bb5ecf96"} Oct 07 13:20:26 crc kubenswrapper[4677]: I1007 13:20:26.440051 4677 generic.go:334] "Generic (PLEG): container finished" podID="8ec4d41c-e5c7-49c2-be5b-2b228d1ddcb8" containerID="9d3359c7c136a9e1ea3a4a41290e3897c160364c2dc05f4779b52a6e74a12246" exitCode=0 Oct 07 13:20:26 crc kubenswrapper[4677]: I1007 13:20:26.440183 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/openstack-galera-1" event={"ID":"8ec4d41c-e5c7-49c2-be5b-2b228d1ddcb8","Type":"ContainerDied","Data":"9d3359c7c136a9e1ea3a4a41290e3897c160364c2dc05f4779b52a6e74a12246"} Oct 07 13:20:26 crc kubenswrapper[4677]: I1007 13:20:26.443381 4677 generic.go:334] "Generic (PLEG): container finished" podID="bf5b66ae-3d1d-4299-bfe7-d3f3eb705177" containerID="fb181f0f4644fb4c7ac9cfd18869ed5b8489d38d377c7d7ab7d3d9535c1db160" exitCode=0 Oct 07 13:20:26 crc kubenswrapper[4677]: I1007 13:20:26.443471 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/openstack-galera-0" 
event={"ID":"bf5b66ae-3d1d-4299-bfe7-d3f3eb705177","Type":"ContainerDied","Data":"fb181f0f4644fb4c7ac9cfd18869ed5b8489d38d377c7d7ab7d3d9535c1db160"} Oct 07 13:20:26 crc kubenswrapper[4677]: I1007 13:20:26.942021 4677 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/memcached-0"] Oct 07 13:20:26 crc kubenswrapper[4677]: I1007 13:20:26.943591 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/memcached-0" Oct 07 13:20:26 crc kubenswrapper[4677]: I1007 13:20:26.946524 4677 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"memcached-memcached-dockercfg-5nl9v" Oct 07 13:20:26 crc kubenswrapper[4677]: I1007 13:20:26.953635 4677 reflector.go:368] Caches populated for *v1.ConfigMap from object-"keystone-kuttl-tests"/"memcached-config-data" Oct 07 13:20:26 crc kubenswrapper[4677]: I1007 13:20:26.955462 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/memcached-0"] Oct 07 13:20:27 crc kubenswrapper[4677]: I1007 13:20:27.047964 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a8dabf1b-2f0f-4f7f-8342-31001928330b-config-data\") pod \"memcached-0\" (UID: \"a8dabf1b-2f0f-4f7f-8342-31001928330b\") " pod="keystone-kuttl-tests/memcached-0" Oct 07 13:20:27 crc kubenswrapper[4677]: I1007 13:20:27.048076 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4gdc9\" (UniqueName: \"kubernetes.io/projected/a8dabf1b-2f0f-4f7f-8342-31001928330b-kube-api-access-4gdc9\") pod \"memcached-0\" (UID: \"a8dabf1b-2f0f-4f7f-8342-31001928330b\") " pod="keystone-kuttl-tests/memcached-0" Oct 07 13:20:27 crc kubenswrapper[4677]: I1007 13:20:27.048393 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/a8dabf1b-2f0f-4f7f-8342-31001928330b-kolla-config\") pod \"memcached-0\" (UID: \"a8dabf1b-2f0f-4f7f-8342-31001928330b\") " pod="keystone-kuttl-tests/memcached-0" Oct 07 13:20:27 crc kubenswrapper[4677]: I1007 13:20:27.150185 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a8dabf1b-2f0f-4f7f-8342-31001928330b-config-data\") pod \"memcached-0\" (UID: \"a8dabf1b-2f0f-4f7f-8342-31001928330b\") " pod="keystone-kuttl-tests/memcached-0" Oct 07 13:20:27 crc kubenswrapper[4677]: I1007 13:20:27.150221 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4gdc9\" (UniqueName: \"kubernetes.io/projected/a8dabf1b-2f0f-4f7f-8342-31001928330b-kube-api-access-4gdc9\") pod \"memcached-0\" (UID: \"a8dabf1b-2f0f-4f7f-8342-31001928330b\") " pod="keystone-kuttl-tests/memcached-0" Oct 07 13:20:27 crc kubenswrapper[4677]: I1007 13:20:27.150301 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/a8dabf1b-2f0f-4f7f-8342-31001928330b-kolla-config\") pod \"memcached-0\" (UID: \"a8dabf1b-2f0f-4f7f-8342-31001928330b\") " pod="keystone-kuttl-tests/memcached-0" Oct 07 13:20:27 crc kubenswrapper[4677]: I1007 13:20:27.151001 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/a8dabf1b-2f0f-4f7f-8342-31001928330b-kolla-config\") pod \"memcached-0\" (UID: 
\"a8dabf1b-2f0f-4f7f-8342-31001928330b\") " pod="keystone-kuttl-tests/memcached-0" Oct 07 13:20:27 crc kubenswrapper[4677]: I1007 13:20:27.151054 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a8dabf1b-2f0f-4f7f-8342-31001928330b-config-data\") pod \"memcached-0\" (UID: \"a8dabf1b-2f0f-4f7f-8342-31001928330b\") " pod="keystone-kuttl-tests/memcached-0" Oct 07 13:20:27 crc kubenswrapper[4677]: I1007 13:20:27.172519 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4gdc9\" (UniqueName: \"kubernetes.io/projected/a8dabf1b-2f0f-4f7f-8342-31001928330b-kube-api-access-4gdc9\") pod \"memcached-0\" (UID: \"a8dabf1b-2f0f-4f7f-8342-31001928330b\") " pod="keystone-kuttl-tests/memcached-0" Oct 07 13:20:27 crc kubenswrapper[4677]: I1007 13:20:27.257520 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/memcached-0" Oct 07 13:20:27 crc kubenswrapper[4677]: I1007 13:20:27.452967 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-index-pt6z2" event={"ID":"1a451150-fba9-465e-aab2-5d71965a674d","Type":"ContainerStarted","Data":"9410fa0e670576225ad8695dce7e2512f27abe30adcecdb3dec7621a3408a4e6"} Oct 07 13:20:27 crc kubenswrapper[4677]: I1007 13:20:27.453013 4677 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/rabbitmq-cluster-operator-index-pt6z2" podUID="1a451150-fba9-465e-aab2-5d71965a674d" containerName="registry-server" containerID="cri-o://9410fa0e670576225ad8695dce7e2512f27abe30adcecdb3dec7621a3408a4e6" gracePeriod=2 Oct 07 13:20:27 crc kubenswrapper[4677]: I1007 13:20:27.462425 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/openstack-galera-0" event={"ID":"bf5b66ae-3d1d-4299-bfe7-d3f3eb705177","Type":"ContainerStarted","Data":"6c1488e323d006e4b4e77fd48cf7b100005e7d12979a7581d721031c896a6bd9"} Oct 07 13:20:27 crc kubenswrapper[4677]: I1007 13:20:27.467880 4677 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-index-pt6z2" podStartSLOduration=2.242951182 podStartE2EDuration="8.467862327s" podCreationTimestamp="2025-10-07 13:20:19 +0000 UTC" firstStartedPulling="2025-10-07 13:20:20.398667217 +0000 UTC m=+791.884376362" lastFinishedPulling="2025-10-07 13:20:26.623578392 +0000 UTC m=+798.109287507" observedRunningTime="2025-10-07 13:20:27.465654864 +0000 UTC m=+798.951363989" watchObservedRunningTime="2025-10-07 13:20:27.467862327 +0000 UTC m=+798.953571442" Oct 07 13:20:27 crc kubenswrapper[4677]: I1007 13:20:27.480856 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/openstack-galera-2" event={"ID":"3b168b6a-df61-44ed-8a09-ed30d3ecc2ea","Type":"ContainerStarted","Data":"1018dc138a07f457eeff18e8d87b096ccab51a8c0aea520899638fb2f9fdd362"} Oct 07 13:20:27 crc kubenswrapper[4677]: I1007 13:20:27.482298 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/openstack-galera-1" event={"ID":"8ec4d41c-e5c7-49c2-be5b-2b228d1ddcb8","Type":"ContainerStarted","Data":"86b85efb94bf982e8572ff8e51fc224f0050532519784aa519a0408928d5d6de"} Oct 07 13:20:27 crc kubenswrapper[4677]: I1007 13:20:27.484030 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-index-cxhtv" 
event={"ID":"b24052e2-c147-452e-bc01-0970fe195485","Type":"ContainerStarted","Data":"86e9665515dd78001f154b523bf7e079791485abdf5577c38262fd3a6616ce80"} Oct 07 13:20:27 crc kubenswrapper[4677]: I1007 13:20:27.484057 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-index-cxhtv" event={"ID":"b24052e2-c147-452e-bc01-0970fe195485","Type":"ContainerStarted","Data":"072eebb6c0a61c0d10ce4c2190a3f199c8347a203836b8e73a31fccd53380c8b"} Oct 07 13:20:27 crc kubenswrapper[4677]: I1007 13:20:27.501417 4677 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keystone-kuttl-tests/openstack-galera-0" podStartSLOduration=12.774150955 podStartE2EDuration="21.501130073s" podCreationTimestamp="2025-10-07 13:20:06 +0000 UTC" firstStartedPulling="2025-10-07 13:20:08.508969453 +0000 UTC m=+779.994678568" lastFinishedPulling="2025-10-07 13:20:17.235948571 +0000 UTC m=+788.721657686" observedRunningTime="2025-10-07 13:20:27.495328396 +0000 UTC m=+798.981037521" watchObservedRunningTime="2025-10-07 13:20:27.501130073 +0000 UTC m=+798.986839188" Oct 07 13:20:27 crc kubenswrapper[4677]: I1007 13:20:27.519465 4677 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keystone-kuttl-tests/openstack-galera-2" podStartSLOduration=12.838620177 podStartE2EDuration="21.519448059s" podCreationTimestamp="2025-10-07 13:20:06 +0000 UTC" firstStartedPulling="2025-10-07 13:20:08.679760198 +0000 UTC m=+780.165469363" lastFinishedPulling="2025-10-07 13:20:17.36058813 +0000 UTC m=+788.846297245" observedRunningTime="2025-10-07 13:20:27.517641077 +0000 UTC m=+799.003350202" watchObservedRunningTime="2025-10-07 13:20:27.519448059 +0000 UTC m=+799.005157184" Oct 07 13:20:27 crc kubenswrapper[4677]: I1007 13:20:27.545239 4677 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keystone-kuttl-tests/openstack-galera-1" podStartSLOduration=11.953184068 podStartE2EDuration="21.545220989s" podCreationTimestamp="2025-10-07 13:20:06 +0000 UTC" firstStartedPulling="2025-10-07 13:20:08.423933791 +0000 UTC m=+779.909642916" lastFinishedPulling="2025-10-07 13:20:18.015970722 +0000 UTC m=+789.501679837" observedRunningTime="2025-10-07 13:20:27.537275881 +0000 UTC m=+799.022985046" watchObservedRunningTime="2025-10-07 13:20:27.545220989 +0000 UTC m=+799.030930104" Oct 07 13:20:27 crc kubenswrapper[4677]: I1007 13:20:27.559371 4677 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-index-cxhtv" podStartSLOduration=2.958587793 podStartE2EDuration="3.559352805s" podCreationTimestamp="2025-10-07 13:20:24 +0000 UTC" firstStartedPulling="2025-10-07 13:20:26.47307381 +0000 UTC m=+797.958782935" lastFinishedPulling="2025-10-07 13:20:27.073838832 +0000 UTC m=+798.559547947" observedRunningTime="2025-10-07 13:20:27.557724948 +0000 UTC m=+799.043434073" watchObservedRunningTime="2025-10-07 13:20:27.559352805 +0000 UTC m=+799.045061920" Oct 07 13:20:27 crc kubenswrapper[4677]: I1007 13:20:27.701285 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/memcached-0"] Oct 07 13:20:27 crc kubenswrapper[4677]: W1007 13:20:27.704019 4677 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda8dabf1b_2f0f_4f7f_8342_31001928330b.slice/crio-58362ffda0c2edac31261f1351d114eee6f2f97fb6809d85dc3e390f14c55a78 WatchSource:0}: Error finding container 
58362ffda0c2edac31261f1351d114eee6f2f97fb6809d85dc3e390f14c55a78: Status 404 returned error can't find the container with id 58362ffda0c2edac31261f1351d114eee6f2f97fb6809d85dc3e390f14c55a78 Oct 07 13:20:27 crc kubenswrapper[4677]: I1007 13:20:27.902917 4677 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-index-pt6z2" Oct 07 13:20:28 crc kubenswrapper[4677]: I1007 13:20:28.061900 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cthgg\" (UniqueName: \"kubernetes.io/projected/1a451150-fba9-465e-aab2-5d71965a674d-kube-api-access-cthgg\") pod \"1a451150-fba9-465e-aab2-5d71965a674d\" (UID: \"1a451150-fba9-465e-aab2-5d71965a674d\") " Oct 07 13:20:28 crc kubenswrapper[4677]: I1007 13:20:28.070563 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1a451150-fba9-465e-aab2-5d71965a674d-kube-api-access-cthgg" (OuterVolumeSpecName: "kube-api-access-cthgg") pod "1a451150-fba9-465e-aab2-5d71965a674d" (UID: "1a451150-fba9-465e-aab2-5d71965a674d"). InnerVolumeSpecName "kube-api-access-cthgg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:20:28 crc kubenswrapper[4677]: I1007 13:20:28.121550 4677 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="keystone-kuttl-tests/openstack-galera-0" Oct 07 13:20:28 crc kubenswrapper[4677]: I1007 13:20:28.121599 4677 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="keystone-kuttl-tests/openstack-galera-0" Oct 07 13:20:28 crc kubenswrapper[4677]: I1007 13:20:28.163218 4677 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cthgg\" (UniqueName: \"kubernetes.io/projected/1a451150-fba9-465e-aab2-5d71965a674d-kube-api-access-cthgg\") on node \"crc\" DevicePath \"\"" Oct 07 13:20:28 crc kubenswrapper[4677]: I1007 13:20:28.175024 4677 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="keystone-kuttl-tests/openstack-galera-2" Oct 07 13:20:28 crc kubenswrapper[4677]: I1007 13:20:28.175072 4677 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="keystone-kuttl-tests/openstack-galera-2" Oct 07 13:20:28 crc kubenswrapper[4677]: I1007 13:20:28.185858 4677 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="keystone-kuttl-tests/openstack-galera-1" Oct 07 13:20:28 crc kubenswrapper[4677]: I1007 13:20:28.185934 4677 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="keystone-kuttl-tests/openstack-galera-1" Oct 07 13:20:28 crc kubenswrapper[4677]: I1007 13:20:28.490288 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/memcached-0" event={"ID":"a8dabf1b-2f0f-4f7f-8342-31001928330b","Type":"ContainerStarted","Data":"58362ffda0c2edac31261f1351d114eee6f2f97fb6809d85dc3e390f14c55a78"} Oct 07 13:20:28 crc kubenswrapper[4677]: I1007 13:20:28.492013 4677 generic.go:334] "Generic (PLEG): container finished" podID="1a451150-fba9-465e-aab2-5d71965a674d" containerID="9410fa0e670576225ad8695dce7e2512f27abe30adcecdb3dec7621a3408a4e6" exitCode=0 Oct 07 13:20:28 crc kubenswrapper[4677]: I1007 13:20:28.492829 4677 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-index-pt6z2" Oct 07 13:20:28 crc kubenswrapper[4677]: I1007 13:20:28.495575 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-index-pt6z2" event={"ID":"1a451150-fba9-465e-aab2-5d71965a674d","Type":"ContainerDied","Data":"9410fa0e670576225ad8695dce7e2512f27abe30adcecdb3dec7621a3408a4e6"} Oct 07 13:20:28 crc kubenswrapper[4677]: I1007 13:20:28.495628 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-index-pt6z2" event={"ID":"1a451150-fba9-465e-aab2-5d71965a674d","Type":"ContainerDied","Data":"1c39a5b4a94c9077a96705488901ace61536d08b049fad387d3ba903bfb53070"} Oct 07 13:20:28 crc kubenswrapper[4677]: I1007 13:20:28.495647 4677 scope.go:117] "RemoveContainer" containerID="9410fa0e670576225ad8695dce7e2512f27abe30adcecdb3dec7621a3408a4e6" Oct 07 13:20:28 crc kubenswrapper[4677]: I1007 13:20:28.522882 4677 scope.go:117] "RemoveContainer" containerID="9410fa0e670576225ad8695dce7e2512f27abe30adcecdb3dec7621a3408a4e6" Oct 07 13:20:28 crc kubenswrapper[4677]: E1007 13:20:28.523397 4677 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9410fa0e670576225ad8695dce7e2512f27abe30adcecdb3dec7621a3408a4e6\": container with ID starting with 9410fa0e670576225ad8695dce7e2512f27abe30adcecdb3dec7621a3408a4e6 not found: ID does not exist" containerID="9410fa0e670576225ad8695dce7e2512f27abe30adcecdb3dec7621a3408a4e6" Oct 07 13:20:28 crc kubenswrapper[4677]: I1007 13:20:28.523453 4677 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9410fa0e670576225ad8695dce7e2512f27abe30adcecdb3dec7621a3408a4e6"} err="failed to get container status \"9410fa0e670576225ad8695dce7e2512f27abe30adcecdb3dec7621a3408a4e6\": rpc error: code = NotFound desc = could not find container \"9410fa0e670576225ad8695dce7e2512f27abe30adcecdb3dec7621a3408a4e6\": container with ID starting with 9410fa0e670576225ad8695dce7e2512f27abe30adcecdb3dec7621a3408a4e6 not found: ID does not exist" Oct 07 13:20:28 crc kubenswrapper[4677]: I1007 13:20:28.525699 4677 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-pt6z2"] Oct 07 13:20:28 crc kubenswrapper[4677]: I1007 13:20:28.529746 4677 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-pt6z2"] Oct 07 13:20:29 crc kubenswrapper[4677]: I1007 13:20:29.334390 4677 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1a451150-fba9-465e-aab2-5d71965a674d" path="/var/lib/kubelet/pods/1a451150-fba9-465e-aab2-5d71965a674d/volumes" Oct 07 13:20:30 crc kubenswrapper[4677]: I1007 13:20:30.509756 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/memcached-0" event={"ID":"a8dabf1b-2f0f-4f7f-8342-31001928330b","Type":"ContainerStarted","Data":"c1ca83bd1f6257f9e5b9a6701b8f358537b3598c83dd3dcf8388a9277d9189f7"} Oct 07 13:20:30 crc kubenswrapper[4677]: I1007 13:20:30.510222 4677 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="keystone-kuttl-tests/memcached-0" Oct 07 13:20:30 crc kubenswrapper[4677]: I1007 13:20:30.534030 4677 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keystone-kuttl-tests/memcached-0" podStartSLOduration=2.175979151 podStartE2EDuration="4.534014079s" podCreationTimestamp="2025-10-07 13:20:26 +0000 
UTC" firstStartedPulling="2025-10-07 13:20:27.707082877 +0000 UTC m=+799.192791992" lastFinishedPulling="2025-10-07 13:20:30.065117795 +0000 UTC m=+801.550826920" observedRunningTime="2025-10-07 13:20:30.532001421 +0000 UTC m=+802.017710526" watchObservedRunningTime="2025-10-07 13:20:30.534014079 +0000 UTC m=+802.019723194" Oct 07 13:20:34 crc kubenswrapper[4677]: I1007 13:20:34.269667 4677 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="keystone-kuttl-tests/openstack-galera-2" Oct 07 13:20:34 crc kubenswrapper[4677]: I1007 13:20:34.338926 4677 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="keystone-kuttl-tests/openstack-galera-2" Oct 07 13:20:34 crc kubenswrapper[4677]: I1007 13:20:34.694961 4677 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/rabbitmq-cluster-operator-index-cxhtv" Oct 07 13:20:34 crc kubenswrapper[4677]: I1007 13:20:34.695288 4677 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/rabbitmq-cluster-operator-index-cxhtv" Oct 07 13:20:34 crc kubenswrapper[4677]: E1007 13:20:34.703520 4677 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.65:50842->38.102.83.65:46729: write tcp 38.102.83.65:50842->38.102.83.65:46729: write: broken pipe Oct 07 13:20:34 crc kubenswrapper[4677]: I1007 13:20:34.736727 4677 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/rabbitmq-cluster-operator-index-cxhtv" Oct 07 13:20:35 crc kubenswrapper[4677]: I1007 13:20:35.573818 4677 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/rabbitmq-cluster-operator-index-cxhtv" Oct 07 13:20:37 crc kubenswrapper[4677]: I1007 13:20:37.259565 4677 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="keystone-kuttl-tests/memcached-0" Oct 07 13:20:38 crc kubenswrapper[4677]: I1007 13:20:38.244884 4677 prober.go:107] "Probe failed" probeType="Readiness" pod="keystone-kuttl-tests/openstack-galera-2" podUID="3b168b6a-df61-44ed-8a09-ed30d3ecc2ea" containerName="galera" probeResult="failure" output=< Oct 07 13:20:38 crc kubenswrapper[4677]: wsrep_local_state_comment (Donor/Desynced) differs from Synced Oct 07 13:20:38 crc kubenswrapper[4677]: > Oct 07 13:20:40 crc kubenswrapper[4677]: I1007 13:20:40.917412 4677 patch_prober.go:28] interesting pod/machine-config-daemon-r7cnz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 13:20:40 crc kubenswrapper[4677]: I1007 13:20:40.917950 4677 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-r7cnz" podUID="7879fa59-a7cb-4d29-ba3a-c91f43bfcba6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 13:20:43 crc kubenswrapper[4677]: I1007 13:20:43.039898 4677 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="keystone-kuttl-tests/openstack-galera-0" Oct 07 13:20:43 crc kubenswrapper[4677]: I1007 13:20:43.118154 4677 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="keystone-kuttl-tests/openstack-galera-0" Oct 07 13:20:44 crc kubenswrapper[4677]: I1007 13:20:44.228927 4677 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e59026hnp"] Oct 07 13:20:44 crc kubenswrapper[4677]: E1007 13:20:44.229548 4677 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a451150-fba9-465e-aab2-5d71965a674d" containerName="registry-server" Oct 07 13:20:44 crc kubenswrapper[4677]: I1007 13:20:44.229564 4677 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a451150-fba9-465e-aab2-5d71965a674d" containerName="registry-server" Oct 07 13:20:44 crc kubenswrapper[4677]: I1007 13:20:44.229695 4677 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a451150-fba9-465e-aab2-5d71965a674d" containerName="registry-server" Oct 07 13:20:44 crc kubenswrapper[4677]: I1007 13:20:44.230785 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e59026hnp" Oct 07 13:20:44 crc kubenswrapper[4677]: I1007 13:20:44.233681 4677 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-krhf2" Oct 07 13:20:44 crc kubenswrapper[4677]: I1007 13:20:44.244920 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e59026hnp"] Oct 07 13:20:44 crc kubenswrapper[4677]: I1007 13:20:44.316817 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fckhx\" (UniqueName: \"kubernetes.io/projected/ba6a4a8e-02b4-4ae3-8e66-77416eb060ae-kube-api-access-fckhx\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e59026hnp\" (UID: \"ba6a4a8e-02b4-4ae3-8e66-77416eb060ae\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e59026hnp" Oct 07 13:20:44 crc kubenswrapper[4677]: I1007 13:20:44.317178 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ba6a4a8e-02b4-4ae3-8e66-77416eb060ae-bundle\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e59026hnp\" (UID: \"ba6a4a8e-02b4-4ae3-8e66-77416eb060ae\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e59026hnp" Oct 07 13:20:44 crc kubenswrapper[4677]: I1007 13:20:44.317383 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ba6a4a8e-02b4-4ae3-8e66-77416eb060ae-util\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e59026hnp\" (UID: \"ba6a4a8e-02b4-4ae3-8e66-77416eb060ae\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e59026hnp" Oct 07 13:20:44 crc kubenswrapper[4677]: I1007 13:20:44.418830 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ba6a4a8e-02b4-4ae3-8e66-77416eb060ae-util\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e59026hnp\" (UID: \"ba6a4a8e-02b4-4ae3-8e66-77416eb060ae\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e59026hnp" Oct 07 13:20:44 crc kubenswrapper[4677]: I1007 13:20:44.419353 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ba6a4a8e-02b4-4ae3-8e66-77416eb060ae-util\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e59026hnp\" (UID: 
\"ba6a4a8e-02b4-4ae3-8e66-77416eb060ae\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e59026hnp" Oct 07 13:20:44 crc kubenswrapper[4677]: I1007 13:20:44.418891 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fckhx\" (UniqueName: \"kubernetes.io/projected/ba6a4a8e-02b4-4ae3-8e66-77416eb060ae-kube-api-access-fckhx\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e59026hnp\" (UID: \"ba6a4a8e-02b4-4ae3-8e66-77416eb060ae\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e59026hnp" Oct 07 13:20:44 crc kubenswrapper[4677]: I1007 13:20:44.419668 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ba6a4a8e-02b4-4ae3-8e66-77416eb060ae-bundle\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e59026hnp\" (UID: \"ba6a4a8e-02b4-4ae3-8e66-77416eb060ae\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e59026hnp" Oct 07 13:20:44 crc kubenswrapper[4677]: I1007 13:20:44.423731 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ba6a4a8e-02b4-4ae3-8e66-77416eb060ae-bundle\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e59026hnp\" (UID: \"ba6a4a8e-02b4-4ae3-8e66-77416eb060ae\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e59026hnp" Oct 07 13:20:44 crc kubenswrapper[4677]: I1007 13:20:44.443583 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fckhx\" (UniqueName: \"kubernetes.io/projected/ba6a4a8e-02b4-4ae3-8e66-77416eb060ae-kube-api-access-fckhx\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e59026hnp\" (UID: \"ba6a4a8e-02b4-4ae3-8e66-77416eb060ae\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e59026hnp" Oct 07 13:20:44 crc kubenswrapper[4677]: I1007 13:20:44.563127 4677 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e59026hnp" Oct 07 13:20:45 crc kubenswrapper[4677]: I1007 13:20:45.081978 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e59026hnp"] Oct 07 13:20:45 crc kubenswrapper[4677]: I1007 13:20:45.646085 4677 generic.go:334] "Generic (PLEG): container finished" podID="ba6a4a8e-02b4-4ae3-8e66-77416eb060ae" containerID="b19eb91cb9b2f2b118237689ff5b2659609146e9eece63852675b5958912403e" exitCode=0 Oct 07 13:20:45 crc kubenswrapper[4677]: I1007 13:20:45.646547 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e59026hnp" event={"ID":"ba6a4a8e-02b4-4ae3-8e66-77416eb060ae","Type":"ContainerDied","Data":"b19eb91cb9b2f2b118237689ff5b2659609146e9eece63852675b5958912403e"} Oct 07 13:20:45 crc kubenswrapper[4677]: I1007 13:20:45.646596 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e59026hnp" event={"ID":"ba6a4a8e-02b4-4ae3-8e66-77416eb060ae","Type":"ContainerStarted","Data":"95e8eb0a43c08a58036b45095ae586450cf93898c4a21efe176652ab9560202f"} Oct 07 13:20:46 crc kubenswrapper[4677]: I1007 13:20:46.594309 4677 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-v5gpv"] Oct 07 13:20:46 crc kubenswrapper[4677]: I1007 13:20:46.596870 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-v5gpv" Oct 07 13:20:46 crc kubenswrapper[4677]: I1007 13:20:46.616688 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-v5gpv"] Oct 07 13:20:46 crc kubenswrapper[4677]: I1007 13:20:46.756220 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a56caa6-d258-4915-beeb-668e3a3a0781-catalog-content\") pod \"community-operators-v5gpv\" (UID: \"7a56caa6-d258-4915-beeb-668e3a3a0781\") " pod="openshift-marketplace/community-operators-v5gpv" Oct 07 13:20:46 crc kubenswrapper[4677]: I1007 13:20:46.756270 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d58jg\" (UniqueName: \"kubernetes.io/projected/7a56caa6-d258-4915-beeb-668e3a3a0781-kube-api-access-d58jg\") pod \"community-operators-v5gpv\" (UID: \"7a56caa6-d258-4915-beeb-668e3a3a0781\") " pod="openshift-marketplace/community-operators-v5gpv" Oct 07 13:20:46 crc kubenswrapper[4677]: I1007 13:20:46.756312 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a56caa6-d258-4915-beeb-668e3a3a0781-utilities\") pod \"community-operators-v5gpv\" (UID: \"7a56caa6-d258-4915-beeb-668e3a3a0781\") " pod="openshift-marketplace/community-operators-v5gpv" Oct 07 13:20:46 crc kubenswrapper[4677]: I1007 13:20:46.857356 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a56caa6-d258-4915-beeb-668e3a3a0781-utilities\") pod \"community-operators-v5gpv\" (UID: \"7a56caa6-d258-4915-beeb-668e3a3a0781\") " pod="openshift-marketplace/community-operators-v5gpv" Oct 07 13:20:46 crc kubenswrapper[4677]: I1007 13:20:46.857478 4677 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a56caa6-d258-4915-beeb-668e3a3a0781-catalog-content\") pod \"community-operators-v5gpv\" (UID: \"7a56caa6-d258-4915-beeb-668e3a3a0781\") " pod="openshift-marketplace/community-operators-v5gpv" Oct 07 13:20:46 crc kubenswrapper[4677]: I1007 13:20:46.857511 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d58jg\" (UniqueName: \"kubernetes.io/projected/7a56caa6-d258-4915-beeb-668e3a3a0781-kube-api-access-d58jg\") pod \"community-operators-v5gpv\" (UID: \"7a56caa6-d258-4915-beeb-668e3a3a0781\") " pod="openshift-marketplace/community-operators-v5gpv" Oct 07 13:20:46 crc kubenswrapper[4677]: I1007 13:20:46.858135 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a56caa6-d258-4915-beeb-668e3a3a0781-catalog-content\") pod \"community-operators-v5gpv\" (UID: \"7a56caa6-d258-4915-beeb-668e3a3a0781\") " pod="openshift-marketplace/community-operators-v5gpv" Oct 07 13:20:46 crc kubenswrapper[4677]: I1007 13:20:46.858381 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a56caa6-d258-4915-beeb-668e3a3a0781-utilities\") pod \"community-operators-v5gpv\" (UID: \"7a56caa6-d258-4915-beeb-668e3a3a0781\") " pod="openshift-marketplace/community-operators-v5gpv" Oct 07 13:20:46 crc kubenswrapper[4677]: I1007 13:20:46.875313 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d58jg\" (UniqueName: \"kubernetes.io/projected/7a56caa6-d258-4915-beeb-668e3a3a0781-kube-api-access-d58jg\") pod \"community-operators-v5gpv\" (UID: \"7a56caa6-d258-4915-beeb-668e3a3a0781\") " pod="openshift-marketplace/community-operators-v5gpv" Oct 07 13:20:46 crc kubenswrapper[4677]: I1007 13:20:46.921166 4677 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-v5gpv" Oct 07 13:20:47 crc kubenswrapper[4677]: I1007 13:20:47.187625 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-v5gpv"] Oct 07 13:20:47 crc kubenswrapper[4677]: W1007 13:20:47.229085 4677 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7a56caa6_d258_4915_beeb_668e3a3a0781.slice/crio-6afe690d3da1d44f5f34dacec764f5923f7b8577a5aff1436464814ebf95aad8 WatchSource:0}: Error finding container 6afe690d3da1d44f5f34dacec764f5923f7b8577a5aff1436464814ebf95aad8: Status 404 returned error can't find the container with id 6afe690d3da1d44f5f34dacec764f5923f7b8577a5aff1436464814ebf95aad8 Oct 07 13:20:47 crc kubenswrapper[4677]: I1007 13:20:47.663110 4677 generic.go:334] "Generic (PLEG): container finished" podID="7a56caa6-d258-4915-beeb-668e3a3a0781" containerID="62948a1df6f451e9b5f0a3905005445687086c3d089be699334100d47f91c77f" exitCode=0 Oct 07 13:20:47 crc kubenswrapper[4677]: I1007 13:20:47.663268 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-v5gpv" event={"ID":"7a56caa6-d258-4915-beeb-668e3a3a0781","Type":"ContainerDied","Data":"62948a1df6f451e9b5f0a3905005445687086c3d089be699334100d47f91c77f"} Oct 07 13:20:47 crc kubenswrapper[4677]: I1007 13:20:47.663569 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-v5gpv" event={"ID":"7a56caa6-d258-4915-beeb-668e3a3a0781","Type":"ContainerStarted","Data":"6afe690d3da1d44f5f34dacec764f5923f7b8577a5aff1436464814ebf95aad8"} Oct 07 13:20:47 crc kubenswrapper[4677]: I1007 13:20:47.666274 4677 generic.go:334] "Generic (PLEG): container finished" podID="ba6a4a8e-02b4-4ae3-8e66-77416eb060ae" containerID="1dc3528812cc007c9d3af3b1e6991eeb10034ed58c9309bfe69e7109e5f0a843" exitCode=0 Oct 07 13:20:47 crc kubenswrapper[4677]: I1007 13:20:47.666313 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e59026hnp" event={"ID":"ba6a4a8e-02b4-4ae3-8e66-77416eb060ae","Type":"ContainerDied","Data":"1dc3528812cc007c9d3af3b1e6991eeb10034ed58c9309bfe69e7109e5f0a843"} Oct 07 13:20:48 crc kubenswrapper[4677]: I1007 13:20:48.700392 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-v5gpv" event={"ID":"7a56caa6-d258-4915-beeb-668e3a3a0781","Type":"ContainerStarted","Data":"275da947100a8cce55bb1949e3ab3ab83acd4955425abea6a55cc73909763b6d"} Oct 07 13:20:48 crc kubenswrapper[4677]: I1007 13:20:48.711822 4677 generic.go:334] "Generic (PLEG): container finished" podID="ba6a4a8e-02b4-4ae3-8e66-77416eb060ae" containerID="a08ecfd8070995620f0b3356874651c81838f936e76b44ef22e5ac692f292f00" exitCode=0 Oct 07 13:20:48 crc kubenswrapper[4677]: I1007 13:20:48.711898 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e59026hnp" event={"ID":"ba6a4a8e-02b4-4ae3-8e66-77416eb060ae","Type":"ContainerDied","Data":"a08ecfd8070995620f0b3356874651c81838f936e76b44ef22e5ac692f292f00"} Oct 07 13:20:48 crc kubenswrapper[4677]: I1007 13:20:48.790575 4677 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="keystone-kuttl-tests/openstack-galera-1" Oct 07 13:20:48 crc kubenswrapper[4677]: I1007 13:20:48.840420 4677 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="keystone-kuttl-tests/openstack-galera-1" Oct 07 13:20:49 crc kubenswrapper[4677]: I1007 13:20:49.723174 4677 generic.go:334] "Generic (PLEG): container finished" podID="7a56caa6-d258-4915-beeb-668e3a3a0781" containerID="275da947100a8cce55bb1949e3ab3ab83acd4955425abea6a55cc73909763b6d" exitCode=0 Oct 07 13:20:49 crc kubenswrapper[4677]: I1007 13:20:49.723266 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-v5gpv" event={"ID":"7a56caa6-d258-4915-beeb-668e3a3a0781","Type":"ContainerDied","Data":"275da947100a8cce55bb1949e3ab3ab83acd4955425abea6a55cc73909763b6d"} Oct 07 13:20:50 crc kubenswrapper[4677]: I1007 13:20:50.057865 4677 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e59026hnp" Oct 07 13:20:50 crc kubenswrapper[4677]: I1007 13:20:50.114296 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ba6a4a8e-02b4-4ae3-8e66-77416eb060ae-bundle\") pod \"ba6a4a8e-02b4-4ae3-8e66-77416eb060ae\" (UID: \"ba6a4a8e-02b4-4ae3-8e66-77416eb060ae\") " Oct 07 13:20:50 crc kubenswrapper[4677]: I1007 13:20:50.114350 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ba6a4a8e-02b4-4ae3-8e66-77416eb060ae-util\") pod \"ba6a4a8e-02b4-4ae3-8e66-77416eb060ae\" (UID: \"ba6a4a8e-02b4-4ae3-8e66-77416eb060ae\") " Oct 07 13:20:50 crc kubenswrapper[4677]: I1007 13:20:50.115536 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ba6a4a8e-02b4-4ae3-8e66-77416eb060ae-bundle" (OuterVolumeSpecName: "bundle") pod "ba6a4a8e-02b4-4ae3-8e66-77416eb060ae" (UID: "ba6a4a8e-02b4-4ae3-8e66-77416eb060ae"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:20:50 crc kubenswrapper[4677]: I1007 13:20:50.115566 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fckhx\" (UniqueName: \"kubernetes.io/projected/ba6a4a8e-02b4-4ae3-8e66-77416eb060ae-kube-api-access-fckhx\") pod \"ba6a4a8e-02b4-4ae3-8e66-77416eb060ae\" (UID: \"ba6a4a8e-02b4-4ae3-8e66-77416eb060ae\") " Oct 07 13:20:50 crc kubenswrapper[4677]: I1007 13:20:50.115899 4677 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ba6a4a8e-02b4-4ae3-8e66-77416eb060ae-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 13:20:50 crc kubenswrapper[4677]: I1007 13:20:50.122991 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba6a4a8e-02b4-4ae3-8e66-77416eb060ae-kube-api-access-fckhx" (OuterVolumeSpecName: "kube-api-access-fckhx") pod "ba6a4a8e-02b4-4ae3-8e66-77416eb060ae" (UID: "ba6a4a8e-02b4-4ae3-8e66-77416eb060ae"). InnerVolumeSpecName "kube-api-access-fckhx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:20:50 crc kubenswrapper[4677]: I1007 13:20:50.127385 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ba6a4a8e-02b4-4ae3-8e66-77416eb060ae-util" (OuterVolumeSpecName: "util") pod "ba6a4a8e-02b4-4ae3-8e66-77416eb060ae" (UID: "ba6a4a8e-02b4-4ae3-8e66-77416eb060ae"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:20:50 crc kubenswrapper[4677]: I1007 13:20:50.217666 4677 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ba6a4a8e-02b4-4ae3-8e66-77416eb060ae-util\") on node \"crc\" DevicePath \"\"" Oct 07 13:20:50 crc kubenswrapper[4677]: I1007 13:20:50.217701 4677 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fckhx\" (UniqueName: \"kubernetes.io/projected/ba6a4a8e-02b4-4ae3-8e66-77416eb060ae-kube-api-access-fckhx\") on node \"crc\" DevicePath \"\"" Oct 07 13:20:50 crc kubenswrapper[4677]: I1007 13:20:50.732004 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e59026hnp" event={"ID":"ba6a4a8e-02b4-4ae3-8e66-77416eb060ae","Type":"ContainerDied","Data":"95e8eb0a43c08a58036b45095ae586450cf93898c4a21efe176652ab9560202f"} Oct 07 13:20:50 crc kubenswrapper[4677]: I1007 13:20:50.732329 4677 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="95e8eb0a43c08a58036b45095ae586450cf93898c4a21efe176652ab9560202f" Oct 07 13:20:50 crc kubenswrapper[4677]: I1007 13:20:50.732067 4677 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e59026hnp" Oct 07 13:20:50 crc kubenswrapper[4677]: I1007 13:20:50.734788 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-v5gpv" event={"ID":"7a56caa6-d258-4915-beeb-668e3a3a0781","Type":"ContainerStarted","Data":"ac58f238db7fcb1323eb96647dd4ec129bfccc860ac1a1edd989132dbe9b4436"} Oct 07 13:20:50 crc kubenswrapper[4677]: I1007 13:20:50.764661 4677 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-v5gpv" podStartSLOduration=2.303614551 podStartE2EDuration="4.764635955s" podCreationTimestamp="2025-10-07 13:20:46 +0000 UTC" firstStartedPulling="2025-10-07 13:20:47.668002427 +0000 UTC m=+819.153711582" lastFinishedPulling="2025-10-07 13:20:50.129023871 +0000 UTC m=+821.614732986" observedRunningTime="2025-10-07 13:20:50.760228258 +0000 UTC m=+822.245937433" watchObservedRunningTime="2025-10-07 13:20:50.764635955 +0000 UTC m=+822.250345080" Oct 07 13:20:56 crc kubenswrapper[4677]: I1007 13:20:56.921771 4677 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-v5gpv" Oct 07 13:20:56 crc kubenswrapper[4677]: I1007 13:20:56.922326 4677 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-v5gpv" Oct 07 13:20:56 crc kubenswrapper[4677]: I1007 13:20:56.967489 4677 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-v5gpv" Oct 07 13:20:57 crc kubenswrapper[4677]: I1007 13:20:57.838511 4677 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-v5gpv" Oct 07 13:20:59 crc kubenswrapper[4677]: I1007 13:20:59.264696 4677 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-779fc9694b-8z6bc"] Oct 07 13:20:59 crc kubenswrapper[4677]: E1007 13:20:59.265059 4677 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba6a4a8e-02b4-4ae3-8e66-77416eb060ae" containerName="util" Oct 07 13:20:59 crc kubenswrapper[4677]: I1007 13:20:59.265080 
4677 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba6a4a8e-02b4-4ae3-8e66-77416eb060ae" containerName="util" Oct 07 13:20:59 crc kubenswrapper[4677]: E1007 13:20:59.265099 4677 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba6a4a8e-02b4-4ae3-8e66-77416eb060ae" containerName="pull" Oct 07 13:20:59 crc kubenswrapper[4677]: I1007 13:20:59.265112 4677 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba6a4a8e-02b4-4ae3-8e66-77416eb060ae" containerName="pull" Oct 07 13:20:59 crc kubenswrapper[4677]: E1007 13:20:59.265154 4677 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba6a4a8e-02b4-4ae3-8e66-77416eb060ae" containerName="extract" Oct 07 13:20:59 crc kubenswrapper[4677]: I1007 13:20:59.265168 4677 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba6a4a8e-02b4-4ae3-8e66-77416eb060ae" containerName="extract" Oct 07 13:20:59 crc kubenswrapper[4677]: I1007 13:20:59.265359 4677 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba6a4a8e-02b4-4ae3-8e66-77416eb060ae" containerName="extract" Oct 07 13:20:59 crc kubenswrapper[4677]: I1007 13:20:59.266118 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-8z6bc" Oct 07 13:20:59 crc kubenswrapper[4677]: I1007 13:20:59.270959 4677 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-dockercfg-wjf7w" Oct 07 13:20:59 crc kubenswrapper[4677]: I1007 13:20:59.279181 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-779fc9694b-8z6bc"] Oct 07 13:20:59 crc kubenswrapper[4677]: I1007 13:20:59.352265 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rbn5x\" (UniqueName: \"kubernetes.io/projected/00ec0b84-feea-4d16-b2f7-8935555cee0d-kube-api-access-rbn5x\") pod \"rabbitmq-cluster-operator-779fc9694b-8z6bc\" (UID: \"00ec0b84-feea-4d16-b2f7-8935555cee0d\") " pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-8z6bc" Oct 07 13:20:59 crc kubenswrapper[4677]: I1007 13:20:59.453929 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rbn5x\" (UniqueName: \"kubernetes.io/projected/00ec0b84-feea-4d16-b2f7-8935555cee0d-kube-api-access-rbn5x\") pod \"rabbitmq-cluster-operator-779fc9694b-8z6bc\" (UID: \"00ec0b84-feea-4d16-b2f7-8935555cee0d\") " pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-8z6bc" Oct 07 13:20:59 crc kubenswrapper[4677]: I1007 13:20:59.475514 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rbn5x\" (UniqueName: \"kubernetes.io/projected/00ec0b84-feea-4d16-b2f7-8935555cee0d-kube-api-access-rbn5x\") pod \"rabbitmq-cluster-operator-779fc9694b-8z6bc\" (UID: \"00ec0b84-feea-4d16-b2f7-8935555cee0d\") " pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-8z6bc" Oct 07 13:20:59 crc kubenswrapper[4677]: I1007 13:20:59.585355 4677 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-8z6bc" Oct 07 13:21:00 crc kubenswrapper[4677]: I1007 13:21:00.017897 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-779fc9694b-8z6bc"] Oct 07 13:21:00 crc kubenswrapper[4677]: I1007 13:21:00.581915 4677 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-v5gpv"] Oct 07 13:21:00 crc kubenswrapper[4677]: I1007 13:21:00.582177 4677 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-v5gpv" podUID="7a56caa6-d258-4915-beeb-668e3a3a0781" containerName="registry-server" containerID="cri-o://ac58f238db7fcb1323eb96647dd4ec129bfccc860ac1a1edd989132dbe9b4436" gracePeriod=2 Oct 07 13:21:00 crc kubenswrapper[4677]: I1007 13:21:00.803108 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-8z6bc" event={"ID":"00ec0b84-feea-4d16-b2f7-8935555cee0d","Type":"ContainerStarted","Data":"c6d96d1afc4affe6ddbbb0d229ea04c07a51a1b840559889fac8391358f80ed7"} Oct 07 13:21:00 crc kubenswrapper[4677]: I1007 13:21:00.805598 4677 generic.go:334] "Generic (PLEG): container finished" podID="7a56caa6-d258-4915-beeb-668e3a3a0781" containerID="ac58f238db7fcb1323eb96647dd4ec129bfccc860ac1a1edd989132dbe9b4436" exitCode=0 Oct 07 13:21:00 crc kubenswrapper[4677]: I1007 13:21:00.805628 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-v5gpv" event={"ID":"7a56caa6-d258-4915-beeb-668e3a3a0781","Type":"ContainerDied","Data":"ac58f238db7fcb1323eb96647dd4ec129bfccc860ac1a1edd989132dbe9b4436"} Oct 07 13:21:01 crc kubenswrapper[4677]: I1007 13:21:01.549346 4677 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-v5gpv" Oct 07 13:21:01 crc kubenswrapper[4677]: I1007 13:21:01.711521 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a56caa6-d258-4915-beeb-668e3a3a0781-utilities\") pod \"7a56caa6-d258-4915-beeb-668e3a3a0781\" (UID: \"7a56caa6-d258-4915-beeb-668e3a3a0781\") " Oct 07 13:21:01 crc kubenswrapper[4677]: I1007 13:21:01.711992 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a56caa6-d258-4915-beeb-668e3a3a0781-catalog-content\") pod \"7a56caa6-d258-4915-beeb-668e3a3a0781\" (UID: \"7a56caa6-d258-4915-beeb-668e3a3a0781\") " Oct 07 13:21:01 crc kubenswrapper[4677]: I1007 13:21:01.712037 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d58jg\" (UniqueName: \"kubernetes.io/projected/7a56caa6-d258-4915-beeb-668e3a3a0781-kube-api-access-d58jg\") pod \"7a56caa6-d258-4915-beeb-668e3a3a0781\" (UID: \"7a56caa6-d258-4915-beeb-668e3a3a0781\") " Oct 07 13:21:01 crc kubenswrapper[4677]: I1007 13:21:01.712851 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7a56caa6-d258-4915-beeb-668e3a3a0781-utilities" (OuterVolumeSpecName: "utilities") pod "7a56caa6-d258-4915-beeb-668e3a3a0781" (UID: "7a56caa6-d258-4915-beeb-668e3a3a0781"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:21:01 crc kubenswrapper[4677]: I1007 13:21:01.720216 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a56caa6-d258-4915-beeb-668e3a3a0781-kube-api-access-d58jg" (OuterVolumeSpecName: "kube-api-access-d58jg") pod "7a56caa6-d258-4915-beeb-668e3a3a0781" (UID: "7a56caa6-d258-4915-beeb-668e3a3a0781"). InnerVolumeSpecName "kube-api-access-d58jg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:21:01 crc kubenswrapper[4677]: I1007 13:21:01.768797 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7a56caa6-d258-4915-beeb-668e3a3a0781-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7a56caa6-d258-4915-beeb-668e3a3a0781" (UID: "7a56caa6-d258-4915-beeb-668e3a3a0781"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:21:01 crc kubenswrapper[4677]: I1007 13:21:01.813058 4677 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a56caa6-d258-4915-beeb-668e3a3a0781-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 13:21:01 crc kubenswrapper[4677]: I1007 13:21:01.813089 4677 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a56caa6-d258-4915-beeb-668e3a3a0781-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 13:21:01 crc kubenswrapper[4677]: I1007 13:21:01.813100 4677 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d58jg\" (UniqueName: \"kubernetes.io/projected/7a56caa6-d258-4915-beeb-668e3a3a0781-kube-api-access-d58jg\") on node \"crc\" DevicePath \"\"" Oct 07 13:21:01 crc kubenswrapper[4677]: I1007 13:21:01.814780 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-v5gpv" event={"ID":"7a56caa6-d258-4915-beeb-668e3a3a0781","Type":"ContainerDied","Data":"6afe690d3da1d44f5f34dacec764f5923f7b8577a5aff1436464814ebf95aad8"} Oct 07 13:21:01 crc kubenswrapper[4677]: I1007 13:21:01.814837 4677 scope.go:117] "RemoveContainer" containerID="ac58f238db7fcb1323eb96647dd4ec129bfccc860ac1a1edd989132dbe9b4436" Oct 07 13:21:01 crc kubenswrapper[4677]: I1007 13:21:01.814961 4677 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-v5gpv" Oct 07 13:21:01 crc kubenswrapper[4677]: I1007 13:21:01.846572 4677 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-v5gpv"] Oct 07 13:21:01 crc kubenswrapper[4677]: I1007 13:21:01.850033 4677 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-v5gpv"] Oct 07 13:21:02 crc kubenswrapper[4677]: I1007 13:21:02.222805 4677 scope.go:117] "RemoveContainer" containerID="275da947100a8cce55bb1949e3ab3ab83acd4955425abea6a55cc73909763b6d" Oct 07 13:21:02 crc kubenswrapper[4677]: I1007 13:21:02.257710 4677 scope.go:117] "RemoveContainer" containerID="62948a1df6f451e9b5f0a3905005445687086c3d089be699334100d47f91c77f" Oct 07 13:21:02 crc kubenswrapper[4677]: I1007 13:21:02.827021 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-8z6bc" event={"ID":"00ec0b84-feea-4d16-b2f7-8935555cee0d","Type":"ContainerStarted","Data":"27568c198345ee4c91033ba5248dec9e2d2b3334c12a0d195de3af20cb358955"} Oct 07 13:21:02 crc kubenswrapper[4677]: I1007 13:21:02.848766 4677 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-8z6bc" podStartSLOduration=1.6160798920000001 podStartE2EDuration="3.848746s" podCreationTimestamp="2025-10-07 13:20:59 +0000 UTC" firstStartedPulling="2025-10-07 13:21:00.029651253 +0000 UTC m=+831.515360408" lastFinishedPulling="2025-10-07 13:21:02.262317401 +0000 UTC m=+833.748026516" observedRunningTime="2025-10-07 13:21:02.847790373 +0000 UTC m=+834.333499488" watchObservedRunningTime="2025-10-07 13:21:02.848746 +0000 UTC m=+834.334455115" Oct 07 13:21:03 crc kubenswrapper[4677]: I1007 13:21:03.313624 4677 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a56caa6-d258-4915-beeb-668e3a3a0781" path="/var/lib/kubelet/pods/7a56caa6-d258-4915-beeb-668e3a3a0781/volumes" Oct 07 13:21:06 crc kubenswrapper[4677]: I1007 13:21:06.391532 4677 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-7x7gc"] Oct 07 13:21:06 crc kubenswrapper[4677]: E1007 13:21:06.392370 4677 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a56caa6-d258-4915-beeb-668e3a3a0781" containerName="registry-server" Oct 07 13:21:06 crc kubenswrapper[4677]: I1007 13:21:06.392402 4677 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a56caa6-d258-4915-beeb-668e3a3a0781" containerName="registry-server" Oct 07 13:21:06 crc kubenswrapper[4677]: E1007 13:21:06.392532 4677 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a56caa6-d258-4915-beeb-668e3a3a0781" containerName="extract-utilities" Oct 07 13:21:06 crc kubenswrapper[4677]: I1007 13:21:06.392558 4677 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a56caa6-d258-4915-beeb-668e3a3a0781" containerName="extract-utilities" Oct 07 13:21:06 crc kubenswrapper[4677]: E1007 13:21:06.392599 4677 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a56caa6-d258-4915-beeb-668e3a3a0781" containerName="extract-content" Oct 07 13:21:06 crc kubenswrapper[4677]: I1007 13:21:06.392616 4677 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a56caa6-d258-4915-beeb-668e3a3a0781" containerName="extract-content" Oct 07 13:21:06 crc kubenswrapper[4677]: I1007 13:21:06.392846 4677 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a56caa6-d258-4915-beeb-668e3a3a0781" 
containerName="registry-server" Oct 07 13:21:06 crc kubenswrapper[4677]: I1007 13:21:06.394603 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7x7gc" Oct 07 13:21:06 crc kubenswrapper[4677]: I1007 13:21:06.417051 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7x7gc"] Oct 07 13:21:06 crc kubenswrapper[4677]: I1007 13:21:06.480531 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e4964a6-8002-49e2-bf7b-5aa664061eaa-utilities\") pod \"certified-operators-7x7gc\" (UID: \"9e4964a6-8002-49e2-bf7b-5aa664061eaa\") " pod="openshift-marketplace/certified-operators-7x7gc" Oct 07 13:21:06 crc kubenswrapper[4677]: I1007 13:21:06.480619 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lg8t8\" (UniqueName: \"kubernetes.io/projected/9e4964a6-8002-49e2-bf7b-5aa664061eaa-kube-api-access-lg8t8\") pod \"certified-operators-7x7gc\" (UID: \"9e4964a6-8002-49e2-bf7b-5aa664061eaa\") " pod="openshift-marketplace/certified-operators-7x7gc" Oct 07 13:21:06 crc kubenswrapper[4677]: I1007 13:21:06.480650 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e4964a6-8002-49e2-bf7b-5aa664061eaa-catalog-content\") pod \"certified-operators-7x7gc\" (UID: \"9e4964a6-8002-49e2-bf7b-5aa664061eaa\") " pod="openshift-marketplace/certified-operators-7x7gc" Oct 07 13:21:06 crc kubenswrapper[4677]: I1007 13:21:06.582094 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e4964a6-8002-49e2-bf7b-5aa664061eaa-utilities\") pod \"certified-operators-7x7gc\" (UID: \"9e4964a6-8002-49e2-bf7b-5aa664061eaa\") " pod="openshift-marketplace/certified-operators-7x7gc" Oct 07 13:21:06 crc kubenswrapper[4677]: I1007 13:21:06.582170 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lg8t8\" (UniqueName: \"kubernetes.io/projected/9e4964a6-8002-49e2-bf7b-5aa664061eaa-kube-api-access-lg8t8\") pod \"certified-operators-7x7gc\" (UID: \"9e4964a6-8002-49e2-bf7b-5aa664061eaa\") " pod="openshift-marketplace/certified-operators-7x7gc" Oct 07 13:21:06 crc kubenswrapper[4677]: I1007 13:21:06.582196 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e4964a6-8002-49e2-bf7b-5aa664061eaa-catalog-content\") pod \"certified-operators-7x7gc\" (UID: \"9e4964a6-8002-49e2-bf7b-5aa664061eaa\") " pod="openshift-marketplace/certified-operators-7x7gc" Oct 07 13:21:06 crc kubenswrapper[4677]: I1007 13:21:06.582662 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e4964a6-8002-49e2-bf7b-5aa664061eaa-catalog-content\") pod \"certified-operators-7x7gc\" (UID: \"9e4964a6-8002-49e2-bf7b-5aa664061eaa\") " pod="openshift-marketplace/certified-operators-7x7gc" Oct 07 13:21:06 crc kubenswrapper[4677]: I1007 13:21:06.582974 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e4964a6-8002-49e2-bf7b-5aa664061eaa-utilities\") pod \"certified-operators-7x7gc\" (UID: \"9e4964a6-8002-49e2-bf7b-5aa664061eaa\") " 
pod="openshift-marketplace/certified-operators-7x7gc" Oct 07 13:21:06 crc kubenswrapper[4677]: I1007 13:21:06.601283 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lg8t8\" (UniqueName: \"kubernetes.io/projected/9e4964a6-8002-49e2-bf7b-5aa664061eaa-kube-api-access-lg8t8\") pod \"certified-operators-7x7gc\" (UID: \"9e4964a6-8002-49e2-bf7b-5aa664061eaa\") " pod="openshift-marketplace/certified-operators-7x7gc" Oct 07 13:21:06 crc kubenswrapper[4677]: I1007 13:21:06.712256 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7x7gc" Oct 07 13:21:07 crc kubenswrapper[4677]: I1007 13:21:07.143932 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7x7gc"] Oct 07 13:21:07 crc kubenswrapper[4677]: W1007 13:21:07.147366 4677 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9e4964a6_8002_49e2_bf7b_5aa664061eaa.slice/crio-6857aa3df9b12e50c063f3c48769252a67b64375587fda339d8508187dd90b60 WatchSource:0}: Error finding container 6857aa3df9b12e50c063f3c48769252a67b64375587fda339d8508187dd90b60: Status 404 returned error can't find the container with id 6857aa3df9b12e50c063f3c48769252a67b64375587fda339d8508187dd90b60 Oct 07 13:21:07 crc kubenswrapper[4677]: I1007 13:21:07.863405 4677 generic.go:334] "Generic (PLEG): container finished" podID="9e4964a6-8002-49e2-bf7b-5aa664061eaa" containerID="da00ac21993ccb8322d1f026245750ac21012edd49232ba4aa6eac354062418f" exitCode=0 Oct 07 13:21:07 crc kubenswrapper[4677]: I1007 13:21:07.863510 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7x7gc" event={"ID":"9e4964a6-8002-49e2-bf7b-5aa664061eaa","Type":"ContainerDied","Data":"da00ac21993ccb8322d1f026245750ac21012edd49232ba4aa6eac354062418f"} Oct 07 13:21:07 crc kubenswrapper[4677]: I1007 13:21:07.863794 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7x7gc" event={"ID":"9e4964a6-8002-49e2-bf7b-5aa664061eaa","Type":"ContainerStarted","Data":"6857aa3df9b12e50c063f3c48769252a67b64375587fda339d8508187dd90b60"} Oct 07 13:21:08 crc kubenswrapper[4677]: I1007 13:21:08.873754 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7x7gc" event={"ID":"9e4964a6-8002-49e2-bf7b-5aa664061eaa","Type":"ContainerStarted","Data":"7f2608be74f0930bb1529640efa2e77d1eb4cb5022a8a860ff82a1133a1a5668"} Oct 07 13:21:09 crc kubenswrapper[4677]: I1007 13:21:09.185208 4677 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/rabbitmq-server-0"] Oct 07 13:21:09 crc kubenswrapper[4677]: I1007 13:21:09.186390 4677 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/rabbitmq-server-0" Oct 07 13:21:09 crc kubenswrapper[4677]: I1007 13:21:09.188391 4677 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"rabbitmq-erlang-cookie" Oct 07 13:21:09 crc kubenswrapper[4677]: I1007 13:21:09.188730 4677 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"rabbitmq-server-dockercfg-hlxvn" Oct 07 13:21:09 crc kubenswrapper[4677]: I1007 13:21:09.189027 4677 reflector.go:368] Caches populated for *v1.ConfigMap from object-"keystone-kuttl-tests"/"rabbitmq-plugins-conf" Oct 07 13:21:09 crc kubenswrapper[4677]: I1007 13:21:09.189224 4677 reflector.go:368] Caches populated for *v1.ConfigMap from object-"keystone-kuttl-tests"/"rabbitmq-server-conf" Oct 07 13:21:09 crc kubenswrapper[4677]: I1007 13:21:09.191187 4677 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"rabbitmq-default-user" Oct 07 13:21:09 crc kubenswrapper[4677]: I1007 13:21:09.207479 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/rabbitmq-server-0"] Oct 07 13:21:09 crc kubenswrapper[4677]: I1007 13:21:09.320903 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tdrlz\" (UniqueName: \"kubernetes.io/projected/c50e7113-f37a-4ea1-9d53-c53106564a48-kube-api-access-tdrlz\") pod \"rabbitmq-server-0\" (UID: \"c50e7113-f37a-4ea1-9d53-c53106564a48\") " pod="keystone-kuttl-tests/rabbitmq-server-0" Oct 07 13:21:09 crc kubenswrapper[4677]: I1007 13:21:09.320961 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c50e7113-f37a-4ea1-9d53-c53106564a48-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"c50e7113-f37a-4ea1-9d53-c53106564a48\") " pod="keystone-kuttl-tests/rabbitmq-server-0" Oct 07 13:21:09 crc kubenswrapper[4677]: I1007 13:21:09.320992 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c50e7113-f37a-4ea1-9d53-c53106564a48-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"c50e7113-f37a-4ea1-9d53-c53106564a48\") " pod="keystone-kuttl-tests/rabbitmq-server-0" Oct 07 13:21:09 crc kubenswrapper[4677]: I1007 13:21:09.321049 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c50e7113-f37a-4ea1-9d53-c53106564a48-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"c50e7113-f37a-4ea1-9d53-c53106564a48\") " pod="keystone-kuttl-tests/rabbitmq-server-0" Oct 07 13:21:09 crc kubenswrapper[4677]: I1007 13:21:09.321085 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c50e7113-f37a-4ea1-9d53-c53106564a48-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"c50e7113-f37a-4ea1-9d53-c53106564a48\") " pod="keystone-kuttl-tests/rabbitmq-server-0" Oct 07 13:21:09 crc kubenswrapper[4677]: I1007 13:21:09.321114 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c50e7113-f37a-4ea1-9d53-c53106564a48-pod-info\") pod \"rabbitmq-server-0\" (UID: \"c50e7113-f37a-4ea1-9d53-c53106564a48\") " pod="keystone-kuttl-tests/rabbitmq-server-0" Oct 07 13:21:09 
crc kubenswrapper[4677]: I1007 13:21:09.321159 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-fa3a4c81-a114-45bc-95e2-8ce46e6940f1\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-fa3a4c81-a114-45bc-95e2-8ce46e6940f1\") pod \"rabbitmq-server-0\" (UID: \"c50e7113-f37a-4ea1-9d53-c53106564a48\") " pod="keystone-kuttl-tests/rabbitmq-server-0" Oct 07 13:21:09 crc kubenswrapper[4677]: I1007 13:21:09.321199 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c50e7113-f37a-4ea1-9d53-c53106564a48-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"c50e7113-f37a-4ea1-9d53-c53106564a48\") " pod="keystone-kuttl-tests/rabbitmq-server-0" Oct 07 13:21:09 crc kubenswrapper[4677]: I1007 13:21:09.423037 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c50e7113-f37a-4ea1-9d53-c53106564a48-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"c50e7113-f37a-4ea1-9d53-c53106564a48\") " pod="keystone-kuttl-tests/rabbitmq-server-0" Oct 07 13:21:09 crc kubenswrapper[4677]: I1007 13:21:09.423162 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tdrlz\" (UniqueName: \"kubernetes.io/projected/c50e7113-f37a-4ea1-9d53-c53106564a48-kube-api-access-tdrlz\") pod \"rabbitmq-server-0\" (UID: \"c50e7113-f37a-4ea1-9d53-c53106564a48\") " pod="keystone-kuttl-tests/rabbitmq-server-0" Oct 07 13:21:09 crc kubenswrapper[4677]: I1007 13:21:09.423218 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c50e7113-f37a-4ea1-9d53-c53106564a48-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"c50e7113-f37a-4ea1-9d53-c53106564a48\") " pod="keystone-kuttl-tests/rabbitmq-server-0" Oct 07 13:21:09 crc kubenswrapper[4677]: I1007 13:21:09.423250 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c50e7113-f37a-4ea1-9d53-c53106564a48-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"c50e7113-f37a-4ea1-9d53-c53106564a48\") " pod="keystone-kuttl-tests/rabbitmq-server-0" Oct 07 13:21:09 crc kubenswrapper[4677]: I1007 13:21:09.423345 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c50e7113-f37a-4ea1-9d53-c53106564a48-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"c50e7113-f37a-4ea1-9d53-c53106564a48\") " pod="keystone-kuttl-tests/rabbitmq-server-0" Oct 07 13:21:09 crc kubenswrapper[4677]: I1007 13:21:09.423455 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c50e7113-f37a-4ea1-9d53-c53106564a48-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"c50e7113-f37a-4ea1-9d53-c53106564a48\") " pod="keystone-kuttl-tests/rabbitmq-server-0" Oct 07 13:21:09 crc kubenswrapper[4677]: I1007 13:21:09.423550 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c50e7113-f37a-4ea1-9d53-c53106564a48-pod-info\") pod \"rabbitmq-server-0\" (UID: \"c50e7113-f37a-4ea1-9d53-c53106564a48\") " pod="keystone-kuttl-tests/rabbitmq-server-0" Oct 07 13:21:09 crc kubenswrapper[4677]: I1007 
13:21:09.423665 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-fa3a4c81-a114-45bc-95e2-8ce46e6940f1\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-fa3a4c81-a114-45bc-95e2-8ce46e6940f1\") pod \"rabbitmq-server-0\" (UID: \"c50e7113-f37a-4ea1-9d53-c53106564a48\") " pod="keystone-kuttl-tests/rabbitmq-server-0" Oct 07 13:21:09 crc kubenswrapper[4677]: I1007 13:21:09.424015 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c50e7113-f37a-4ea1-9d53-c53106564a48-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"c50e7113-f37a-4ea1-9d53-c53106564a48\") " pod="keystone-kuttl-tests/rabbitmq-server-0" Oct 07 13:21:09 crc kubenswrapper[4677]: I1007 13:21:09.424565 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c50e7113-f37a-4ea1-9d53-c53106564a48-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"c50e7113-f37a-4ea1-9d53-c53106564a48\") " pod="keystone-kuttl-tests/rabbitmq-server-0" Oct 07 13:21:09 crc kubenswrapper[4677]: I1007 13:21:09.426093 4677 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"rabbitmq-erlang-cookie" Oct 07 13:21:09 crc kubenswrapper[4677]: I1007 13:21:09.426093 4677 reflector.go:368] Caches populated for *v1.ConfigMap from object-"keystone-kuttl-tests"/"rabbitmq-server-conf" Oct 07 13:21:09 crc kubenswrapper[4677]: I1007 13:21:09.427006 4677 reflector.go:368] Caches populated for *v1.ConfigMap from object-"keystone-kuttl-tests"/"rabbitmq-plugins-conf" Oct 07 13:21:09 crc kubenswrapper[4677]: I1007 13:21:09.427942 4677 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Oct 07 13:21:09 crc kubenswrapper[4677]: I1007 13:21:09.427972 4677 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-fa3a4c81-a114-45bc-95e2-8ce46e6940f1\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-fa3a4c81-a114-45bc-95e2-8ce46e6940f1\") pod \"rabbitmq-server-0\" (UID: \"c50e7113-f37a-4ea1-9d53-c53106564a48\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/8e88eb450ea9aafb5a32d43af874cb1d32cfa714e50a6f60cacf0b637080a9f9/globalmount\"" pod="keystone-kuttl-tests/rabbitmq-server-0" Oct 07 13:21:09 crc kubenswrapper[4677]: I1007 13:21:09.431150 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c50e7113-f37a-4ea1-9d53-c53106564a48-pod-info\") pod \"rabbitmq-server-0\" (UID: \"c50e7113-f37a-4ea1-9d53-c53106564a48\") " pod="keystone-kuttl-tests/rabbitmq-server-0" Oct 07 13:21:09 crc kubenswrapper[4677]: I1007 13:21:09.434560 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c50e7113-f37a-4ea1-9d53-c53106564a48-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"c50e7113-f37a-4ea1-9d53-c53106564a48\") " pod="keystone-kuttl-tests/rabbitmq-server-0" Oct 07 13:21:09 crc kubenswrapper[4677]: I1007 13:21:09.439015 4677 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"rabbitmq-default-user" Oct 07 13:21:09 crc kubenswrapper[4677]: I1007 13:21:09.439233 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c50e7113-f37a-4ea1-9d53-c53106564a48-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"c50e7113-f37a-4ea1-9d53-c53106564a48\") " pod="keystone-kuttl-tests/rabbitmq-server-0" Oct 07 13:21:09 crc kubenswrapper[4677]: I1007 13:21:09.440943 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tdrlz\" (UniqueName: \"kubernetes.io/projected/c50e7113-f37a-4ea1-9d53-c53106564a48-kube-api-access-tdrlz\") pod \"rabbitmq-server-0\" (UID: \"c50e7113-f37a-4ea1-9d53-c53106564a48\") " pod="keystone-kuttl-tests/rabbitmq-server-0" Oct 07 13:21:09 crc kubenswrapper[4677]: I1007 13:21:09.447574 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c50e7113-f37a-4ea1-9d53-c53106564a48-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"c50e7113-f37a-4ea1-9d53-c53106564a48\") " pod="keystone-kuttl-tests/rabbitmq-server-0" Oct 07 13:21:09 crc kubenswrapper[4677]: I1007 13:21:09.450061 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-fa3a4c81-a114-45bc-95e2-8ce46e6940f1\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-fa3a4c81-a114-45bc-95e2-8ce46e6940f1\") pod \"rabbitmq-server-0\" (UID: \"c50e7113-f37a-4ea1-9d53-c53106564a48\") " pod="keystone-kuttl-tests/rabbitmq-server-0" Oct 07 13:21:09 crc kubenswrapper[4677]: I1007 13:21:09.518042 4677 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"rabbitmq-server-dockercfg-hlxvn" Oct 07 13:21:09 crc kubenswrapper[4677]: I1007 13:21:09.526531 4677 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/rabbitmq-server-0" Oct 07 13:21:09 crc kubenswrapper[4677]: I1007 13:21:09.895601 4677 generic.go:334] "Generic (PLEG): container finished" podID="9e4964a6-8002-49e2-bf7b-5aa664061eaa" containerID="7f2608be74f0930bb1529640efa2e77d1eb4cb5022a8a860ff82a1133a1a5668" exitCode=0 Oct 07 13:21:09 crc kubenswrapper[4677]: I1007 13:21:09.895655 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7x7gc" event={"ID":"9e4964a6-8002-49e2-bf7b-5aa664061eaa","Type":"ContainerDied","Data":"7f2608be74f0930bb1529640efa2e77d1eb4cb5022a8a860ff82a1133a1a5668"} Oct 07 13:21:09 crc kubenswrapper[4677]: I1007 13:21:09.942420 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/rabbitmq-server-0"] Oct 07 13:21:10 crc kubenswrapper[4677]: I1007 13:21:10.904736 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/rabbitmq-server-0" event={"ID":"c50e7113-f37a-4ea1-9d53-c53106564a48","Type":"ContainerStarted","Data":"b8337d16321f5fa19f2ed880701bf97c952fffc2dc4005f8c06cdfabae088f16"} Oct 07 13:21:10 crc kubenswrapper[4677]: I1007 13:21:10.907192 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7x7gc" event={"ID":"9e4964a6-8002-49e2-bf7b-5aa664061eaa","Type":"ContainerStarted","Data":"9c58cd0314c1f65472fe0d5a6fa5a52112bf84b6ca1ca0466f67dddac49811ec"} Oct 07 13:21:10 crc kubenswrapper[4677]: I1007 13:21:10.917644 4677 patch_prober.go:28] interesting pod/machine-config-daemon-r7cnz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 13:21:10 crc kubenswrapper[4677]: I1007 13:21:10.917704 4677 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-r7cnz" podUID="7879fa59-a7cb-4d29-ba3a-c91f43bfcba6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 13:21:10 crc kubenswrapper[4677]: I1007 13:21:10.917756 4677 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-r7cnz" Oct 07 13:21:10 crc kubenswrapper[4677]: I1007 13:21:10.918348 4677 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ec42d23040a8452012f48d89f3054555831bcfb79cefb8a91a87385178f388c8"} pod="openshift-machine-config-operator/machine-config-daemon-r7cnz" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 07 13:21:10 crc kubenswrapper[4677]: I1007 13:21:10.918464 4677 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-r7cnz" podUID="7879fa59-a7cb-4d29-ba3a-c91f43bfcba6" containerName="machine-config-daemon" containerID="cri-o://ec42d23040a8452012f48d89f3054555831bcfb79cefb8a91a87385178f388c8" gracePeriod=600 Oct 07 13:21:10 crc kubenswrapper[4677]: I1007 13:21:10.926403 4677 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-7x7gc" podStartSLOduration=2.312033793 podStartE2EDuration="4.926380011s" podCreationTimestamp="2025-10-07 13:21:06 +0000 UTC" 
firstStartedPulling="2025-10-07 13:21:07.865798468 +0000 UTC m=+839.351507583" lastFinishedPulling="2025-10-07 13:21:10.480144676 +0000 UTC m=+841.965853801" observedRunningTime="2025-10-07 13:21:10.926235487 +0000 UTC m=+842.411944622" watchObservedRunningTime="2025-10-07 13:21:10.926380011 +0000 UTC m=+842.412089156" Oct 07 13:21:11 crc kubenswrapper[4677]: I1007 13:21:11.780160 4677 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-index-dptwb"] Oct 07 13:21:11 crc kubenswrapper[4677]: I1007 13:21:11.793929 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-index-dptwb"] Oct 07 13:21:11 crc kubenswrapper[4677]: I1007 13:21:11.794061 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-index-dptwb" Oct 07 13:21:11 crc kubenswrapper[4677]: I1007 13:21:11.797491 4677 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-index-dockercfg-bkfck" Oct 07 13:21:11 crc kubenswrapper[4677]: I1007 13:21:11.860690 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bwxqf\" (UniqueName: \"kubernetes.io/projected/869e5c88-2679-4b90-b674-bba233dc88e0-kube-api-access-bwxqf\") pod \"keystone-operator-index-dptwb\" (UID: \"869e5c88-2679-4b90-b674-bba233dc88e0\") " pod="openstack-operators/keystone-operator-index-dptwb" Oct 07 13:21:11 crc kubenswrapper[4677]: I1007 13:21:11.916356 4677 generic.go:334] "Generic (PLEG): container finished" podID="7879fa59-a7cb-4d29-ba3a-c91f43bfcba6" containerID="ec42d23040a8452012f48d89f3054555831bcfb79cefb8a91a87385178f388c8" exitCode=0 Oct 07 13:21:11 crc kubenswrapper[4677]: I1007 13:21:11.916636 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-r7cnz" event={"ID":"7879fa59-a7cb-4d29-ba3a-c91f43bfcba6","Type":"ContainerDied","Data":"ec42d23040a8452012f48d89f3054555831bcfb79cefb8a91a87385178f388c8"} Oct 07 13:21:11 crc kubenswrapper[4677]: I1007 13:21:11.916735 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-r7cnz" event={"ID":"7879fa59-a7cb-4d29-ba3a-c91f43bfcba6","Type":"ContainerStarted","Data":"37fd49a51a3bd5137d45d074e28b4ab8e0800f2fea4c41dcc155e9985c92e63a"} Oct 07 13:21:11 crc kubenswrapper[4677]: I1007 13:21:11.916759 4677 scope.go:117] "RemoveContainer" containerID="75d4db8c22e96ea7fcbf447dc088ac317cb51f5a548c1df77f076e3a1152231a" Oct 07 13:21:11 crc kubenswrapper[4677]: I1007 13:21:11.961991 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bwxqf\" (UniqueName: \"kubernetes.io/projected/869e5c88-2679-4b90-b674-bba233dc88e0-kube-api-access-bwxqf\") pod \"keystone-operator-index-dptwb\" (UID: \"869e5c88-2679-4b90-b674-bba233dc88e0\") " pod="openstack-operators/keystone-operator-index-dptwb" Oct 07 13:21:11 crc kubenswrapper[4677]: I1007 13:21:11.993802 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bwxqf\" (UniqueName: \"kubernetes.io/projected/869e5c88-2679-4b90-b674-bba233dc88e0-kube-api-access-bwxqf\") pod \"keystone-operator-index-dptwb\" (UID: \"869e5c88-2679-4b90-b674-bba233dc88e0\") " pod="openstack-operators/keystone-operator-index-dptwb" Oct 07 13:21:12 crc kubenswrapper[4677]: I1007 13:21:12.127484 4677 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-index-dptwb" Oct 07 13:21:14 crc kubenswrapper[4677]: I1007 13:21:14.986298 4677 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-7js5b"] Oct 07 13:21:14 crc kubenswrapper[4677]: I1007 13:21:14.988950 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7js5b" Oct 07 13:21:15 crc kubenswrapper[4677]: I1007 13:21:15.000585 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7js5b"] Oct 07 13:21:15 crc kubenswrapper[4677]: I1007 13:21:15.111914 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/687f0d4d-64c0-4d3f-b50f-899993b96168-utilities\") pod \"redhat-operators-7js5b\" (UID: \"687f0d4d-64c0-4d3f-b50f-899993b96168\") " pod="openshift-marketplace/redhat-operators-7js5b" Oct 07 13:21:15 crc kubenswrapper[4677]: I1007 13:21:15.112122 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t548t\" (UniqueName: \"kubernetes.io/projected/687f0d4d-64c0-4d3f-b50f-899993b96168-kube-api-access-t548t\") pod \"redhat-operators-7js5b\" (UID: \"687f0d4d-64c0-4d3f-b50f-899993b96168\") " pod="openshift-marketplace/redhat-operators-7js5b" Oct 07 13:21:15 crc kubenswrapper[4677]: I1007 13:21:15.112531 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/687f0d4d-64c0-4d3f-b50f-899993b96168-catalog-content\") pod \"redhat-operators-7js5b\" (UID: \"687f0d4d-64c0-4d3f-b50f-899993b96168\") " pod="openshift-marketplace/redhat-operators-7js5b" Oct 07 13:21:15 crc kubenswrapper[4677]: I1007 13:21:15.214009 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/687f0d4d-64c0-4d3f-b50f-899993b96168-utilities\") pod \"redhat-operators-7js5b\" (UID: \"687f0d4d-64c0-4d3f-b50f-899993b96168\") " pod="openshift-marketplace/redhat-operators-7js5b" Oct 07 13:21:15 crc kubenswrapper[4677]: I1007 13:21:15.214338 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t548t\" (UniqueName: \"kubernetes.io/projected/687f0d4d-64c0-4d3f-b50f-899993b96168-kube-api-access-t548t\") pod \"redhat-operators-7js5b\" (UID: \"687f0d4d-64c0-4d3f-b50f-899993b96168\") " pod="openshift-marketplace/redhat-operators-7js5b" Oct 07 13:21:15 crc kubenswrapper[4677]: I1007 13:21:15.214457 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/687f0d4d-64c0-4d3f-b50f-899993b96168-catalog-content\") pod \"redhat-operators-7js5b\" (UID: \"687f0d4d-64c0-4d3f-b50f-899993b96168\") " pod="openshift-marketplace/redhat-operators-7js5b" Oct 07 13:21:15 crc kubenswrapper[4677]: I1007 13:21:15.214990 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/687f0d4d-64c0-4d3f-b50f-899993b96168-catalog-content\") pod \"redhat-operators-7js5b\" (UID: \"687f0d4d-64c0-4d3f-b50f-899993b96168\") " pod="openshift-marketplace/redhat-operators-7js5b" Oct 07 13:21:15 crc kubenswrapper[4677]: I1007 13:21:15.215233 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" 
(UniqueName: \"kubernetes.io/empty-dir/687f0d4d-64c0-4d3f-b50f-899993b96168-utilities\") pod \"redhat-operators-7js5b\" (UID: \"687f0d4d-64c0-4d3f-b50f-899993b96168\") " pod="openshift-marketplace/redhat-operators-7js5b" Oct 07 13:21:15 crc kubenswrapper[4677]: I1007 13:21:15.249263 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t548t\" (UniqueName: \"kubernetes.io/projected/687f0d4d-64c0-4d3f-b50f-899993b96168-kube-api-access-t548t\") pod \"redhat-operators-7js5b\" (UID: \"687f0d4d-64c0-4d3f-b50f-899993b96168\") " pod="openshift-marketplace/redhat-operators-7js5b" Oct 07 13:21:15 crc kubenswrapper[4677]: I1007 13:21:15.322730 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7js5b" Oct 07 13:21:15 crc kubenswrapper[4677]: I1007 13:21:15.768661 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-index-dptwb"] Oct 07 13:21:16 crc kubenswrapper[4677]: W1007 13:21:16.083169 4677 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod869e5c88_2679_4b90_b674_bba233dc88e0.slice/crio-a6a12022e0bc5bea56657c9e9d7efbe0461519c6663e2a1c68e13bce0e63dee8 WatchSource:0}: Error finding container a6a12022e0bc5bea56657c9e9d7efbe0461519c6663e2a1c68e13bce0e63dee8: Status 404 returned error can't find the container with id a6a12022e0bc5bea56657c9e9d7efbe0461519c6663e2a1c68e13bce0e63dee8 Oct 07 13:21:16 crc kubenswrapper[4677]: I1007 13:21:16.334807 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7js5b"] Oct 07 13:21:16 crc kubenswrapper[4677]: W1007 13:21:16.344708 4677 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod687f0d4d_64c0_4d3f_b50f_899993b96168.slice/crio-b17658690959e740afd4664a89262845d9cedf118a2e6f28e6b5e532bb42c7d4 WatchSource:0}: Error finding container b17658690959e740afd4664a89262845d9cedf118a2e6f28e6b5e532bb42c7d4: Status 404 returned error can't find the container with id b17658690959e740afd4664a89262845d9cedf118a2e6f28e6b5e532bb42c7d4 Oct 07 13:21:16 crc kubenswrapper[4677]: I1007 13:21:16.712569 4677 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-7x7gc" Oct 07 13:21:16 crc kubenswrapper[4677]: I1007 13:21:16.712982 4677 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-7x7gc" Oct 07 13:21:16 crc kubenswrapper[4677]: I1007 13:21:16.755967 4677 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-7x7gc" Oct 07 13:21:16 crc kubenswrapper[4677]: I1007 13:21:16.952718 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-index-dptwb" event={"ID":"869e5c88-2679-4b90-b674-bba233dc88e0","Type":"ContainerStarted","Data":"a6a12022e0bc5bea56657c9e9d7efbe0461519c6663e2a1c68e13bce0e63dee8"} Oct 07 13:21:16 crc kubenswrapper[4677]: I1007 13:21:16.953961 4677 generic.go:334] "Generic (PLEG): container finished" podID="687f0d4d-64c0-4d3f-b50f-899993b96168" containerID="5391666f91cd8e7bd8d309e61f59813415d7510b34f5a17b52a1495b582fdbdd" exitCode=0 Oct 07 13:21:16 crc kubenswrapper[4677]: I1007 13:21:16.954028 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7js5b" 
event={"ID":"687f0d4d-64c0-4d3f-b50f-899993b96168","Type":"ContainerDied","Data":"5391666f91cd8e7bd8d309e61f59813415d7510b34f5a17b52a1495b582fdbdd"} Oct 07 13:21:16 crc kubenswrapper[4677]: I1007 13:21:16.954043 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7js5b" event={"ID":"687f0d4d-64c0-4d3f-b50f-899993b96168","Type":"ContainerStarted","Data":"b17658690959e740afd4664a89262845d9cedf118a2e6f28e6b5e532bb42c7d4"} Oct 07 13:21:17 crc kubenswrapper[4677]: I1007 13:21:17.011793 4677 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-7x7gc" Oct 07 13:21:17 crc kubenswrapper[4677]: I1007 13:21:17.965991 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/rabbitmq-server-0" event={"ID":"c50e7113-f37a-4ea1-9d53-c53106564a48","Type":"ContainerStarted","Data":"4bd0bb354d34a9f6279b5aa80df037c139c91e480664fe124c3295f33023ff11"} Oct 07 13:21:17 crc kubenswrapper[4677]: I1007 13:21:17.968204 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-index-dptwb" event={"ID":"869e5c88-2679-4b90-b674-bba233dc88e0","Type":"ContainerStarted","Data":"cba042fd564e4bfb156e628e61aecc1fd4db9d45c4953dfe62fcdc4005a4d46b"} Oct 07 13:21:18 crc kubenswrapper[4677]: I1007 13:21:18.013186 4677 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-index-dptwb" podStartSLOduration=5.559261993 podStartE2EDuration="7.013164185s" podCreationTimestamp="2025-10-07 13:21:11 +0000 UTC" firstStartedPulling="2025-10-07 13:21:16.090173162 +0000 UTC m=+847.575882277" lastFinishedPulling="2025-10-07 13:21:17.544075354 +0000 UTC m=+849.029784469" observedRunningTime="2025-10-07 13:21:18.00985489 +0000 UTC m=+849.495564015" watchObservedRunningTime="2025-10-07 13:21:18.013164185 +0000 UTC m=+849.498873310" Oct 07 13:21:18 crc kubenswrapper[4677]: I1007 13:21:18.987472 4677 generic.go:334] "Generic (PLEG): container finished" podID="687f0d4d-64c0-4d3f-b50f-899993b96168" containerID="00dc68dfb06cc4033ff6fed01a5ba6a0674847444844e412183f5c79426616ed" exitCode=0 Oct 07 13:21:18 crc kubenswrapper[4677]: I1007 13:21:18.987547 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7js5b" event={"ID":"687f0d4d-64c0-4d3f-b50f-899993b96168","Type":"ContainerDied","Data":"00dc68dfb06cc4033ff6fed01a5ba6a0674847444844e412183f5c79426616ed"} Oct 07 13:21:21 crc kubenswrapper[4677]: I1007 13:21:21.003249 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7js5b" event={"ID":"687f0d4d-64c0-4d3f-b50f-899993b96168","Type":"ContainerStarted","Data":"73e3a5b48ba5cd3620ee6f1638e1f1707327e6c1e7dfc41e68e7ee947f2f2789"} Oct 07 13:21:21 crc kubenswrapper[4677]: I1007 13:21:21.773590 4677 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-7js5b" podStartSLOduration=5.536265565 podStartE2EDuration="7.773572491s" podCreationTimestamp="2025-10-07 13:21:14 +0000 UTC" firstStartedPulling="2025-10-07 13:21:17.43710611 +0000 UTC m=+848.922815225" lastFinishedPulling="2025-10-07 13:21:19.674412996 +0000 UTC m=+851.160122151" observedRunningTime="2025-10-07 13:21:21.039784114 +0000 UTC m=+852.525493219" watchObservedRunningTime="2025-10-07 13:21:21.773572491 +0000 UTC m=+853.259281606" Oct 07 13:21:21 crc kubenswrapper[4677]: I1007 13:21:21.775464 4677 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["openshift-marketplace/certified-operators-7x7gc"] Oct 07 13:21:21 crc kubenswrapper[4677]: I1007 13:21:21.775678 4677 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-7x7gc" podUID="9e4964a6-8002-49e2-bf7b-5aa664061eaa" containerName="registry-server" containerID="cri-o://9c58cd0314c1f65472fe0d5a6fa5a52112bf84b6ca1ca0466f67dddac49811ec" gracePeriod=2 Oct 07 13:21:22 crc kubenswrapper[4677]: I1007 13:21:22.013178 4677 generic.go:334] "Generic (PLEG): container finished" podID="9e4964a6-8002-49e2-bf7b-5aa664061eaa" containerID="9c58cd0314c1f65472fe0d5a6fa5a52112bf84b6ca1ca0466f67dddac49811ec" exitCode=0 Oct 07 13:21:22 crc kubenswrapper[4677]: I1007 13:21:22.013880 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7x7gc" event={"ID":"9e4964a6-8002-49e2-bf7b-5aa664061eaa","Type":"ContainerDied","Data":"9c58cd0314c1f65472fe0d5a6fa5a52112bf84b6ca1ca0466f67dddac49811ec"} Oct 07 13:21:22 crc kubenswrapper[4677]: I1007 13:21:22.128469 4677 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-index-dptwb" Oct 07 13:21:22 crc kubenswrapper[4677]: I1007 13:21:22.128519 4677 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/keystone-operator-index-dptwb" Oct 07 13:21:22 crc kubenswrapper[4677]: I1007 13:21:22.158972 4677 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/keystone-operator-index-dptwb" Oct 07 13:21:22 crc kubenswrapper[4677]: I1007 13:21:22.204412 4677 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7x7gc" Oct 07 13:21:22 crc kubenswrapper[4677]: I1007 13:21:22.323299 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lg8t8\" (UniqueName: \"kubernetes.io/projected/9e4964a6-8002-49e2-bf7b-5aa664061eaa-kube-api-access-lg8t8\") pod \"9e4964a6-8002-49e2-bf7b-5aa664061eaa\" (UID: \"9e4964a6-8002-49e2-bf7b-5aa664061eaa\") " Oct 07 13:21:22 crc kubenswrapper[4677]: I1007 13:21:22.323378 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e4964a6-8002-49e2-bf7b-5aa664061eaa-utilities\") pod \"9e4964a6-8002-49e2-bf7b-5aa664061eaa\" (UID: \"9e4964a6-8002-49e2-bf7b-5aa664061eaa\") " Oct 07 13:21:22 crc kubenswrapper[4677]: I1007 13:21:22.323522 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e4964a6-8002-49e2-bf7b-5aa664061eaa-catalog-content\") pod \"9e4964a6-8002-49e2-bf7b-5aa664061eaa\" (UID: \"9e4964a6-8002-49e2-bf7b-5aa664061eaa\") " Oct 07 13:21:22 crc kubenswrapper[4677]: I1007 13:21:22.324178 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9e4964a6-8002-49e2-bf7b-5aa664061eaa-utilities" (OuterVolumeSpecName: "utilities") pod "9e4964a6-8002-49e2-bf7b-5aa664061eaa" (UID: "9e4964a6-8002-49e2-bf7b-5aa664061eaa"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:21:22 crc kubenswrapper[4677]: I1007 13:21:22.331580 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e4964a6-8002-49e2-bf7b-5aa664061eaa-kube-api-access-lg8t8" (OuterVolumeSpecName: "kube-api-access-lg8t8") pod "9e4964a6-8002-49e2-bf7b-5aa664061eaa" (UID: "9e4964a6-8002-49e2-bf7b-5aa664061eaa"). InnerVolumeSpecName "kube-api-access-lg8t8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:21:22 crc kubenswrapper[4677]: I1007 13:21:22.371471 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9e4964a6-8002-49e2-bf7b-5aa664061eaa-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9e4964a6-8002-49e2-bf7b-5aa664061eaa" (UID: "9e4964a6-8002-49e2-bf7b-5aa664061eaa"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:21:22 crc kubenswrapper[4677]: I1007 13:21:22.425328 4677 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e4964a6-8002-49e2-bf7b-5aa664061eaa-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 13:21:22 crc kubenswrapper[4677]: I1007 13:21:22.425369 4677 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lg8t8\" (UniqueName: \"kubernetes.io/projected/9e4964a6-8002-49e2-bf7b-5aa664061eaa-kube-api-access-lg8t8\") on node \"crc\" DevicePath \"\"" Oct 07 13:21:22 crc kubenswrapper[4677]: I1007 13:21:22.425386 4677 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e4964a6-8002-49e2-bf7b-5aa664061eaa-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 13:21:23 crc kubenswrapper[4677]: I1007 13:21:23.025627 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7x7gc" event={"ID":"9e4964a6-8002-49e2-bf7b-5aa664061eaa","Type":"ContainerDied","Data":"6857aa3df9b12e50c063f3c48769252a67b64375587fda339d8508187dd90b60"} Oct 07 13:21:23 crc kubenswrapper[4677]: I1007 13:21:23.025712 4677 scope.go:117] "RemoveContainer" containerID="9c58cd0314c1f65472fe0d5a6fa5a52112bf84b6ca1ca0466f67dddac49811ec" Oct 07 13:21:23 crc kubenswrapper[4677]: I1007 13:21:23.025648 4677 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-7x7gc" Oct 07 13:21:23 crc kubenswrapper[4677]: I1007 13:21:23.052418 4677 scope.go:117] "RemoveContainer" containerID="7f2608be74f0930bb1529640efa2e77d1eb4cb5022a8a860ff82a1133a1a5668" Oct 07 13:21:23 crc kubenswrapper[4677]: I1007 13:21:23.074409 4677 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7x7gc"] Oct 07 13:21:23 crc kubenswrapper[4677]: I1007 13:21:23.088213 4677 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-index-dptwb" Oct 07 13:21:23 crc kubenswrapper[4677]: I1007 13:21:23.092258 4677 scope.go:117] "RemoveContainer" containerID="da00ac21993ccb8322d1f026245750ac21012edd49232ba4aa6eac354062418f" Oct 07 13:21:23 crc kubenswrapper[4677]: I1007 13:21:23.093242 4677 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-7x7gc"] Oct 07 13:21:23 crc kubenswrapper[4677]: I1007 13:21:23.325615 4677 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9e4964a6-8002-49e2-bf7b-5aa664061eaa" path="/var/lib/kubelet/pods/9e4964a6-8002-49e2-bf7b-5aa664061eaa/volumes" Oct 07 13:21:25 crc kubenswrapper[4677]: I1007 13:21:25.323086 4677 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-7js5b" Oct 07 13:21:25 crc kubenswrapper[4677]: I1007 13:21:25.323496 4677 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-7js5b" Oct 07 13:21:25 crc kubenswrapper[4677]: I1007 13:21:25.387551 4677 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-7js5b" Oct 07 13:21:26 crc kubenswrapper[4677]: I1007 13:21:26.094672 4677 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-7js5b" Oct 07 13:21:26 crc kubenswrapper[4677]: I1007 13:21:26.640232 4677 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/3d8ae2547c933a67aba0b0e56cba361860dfd1d9c4832c2b34acb572d778bsn"] Oct 07 13:21:26 crc kubenswrapper[4677]: E1007 13:21:26.640522 4677 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e4964a6-8002-49e2-bf7b-5aa664061eaa" containerName="extract-content" Oct 07 13:21:26 crc kubenswrapper[4677]: I1007 13:21:26.640536 4677 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e4964a6-8002-49e2-bf7b-5aa664061eaa" containerName="extract-content" Oct 07 13:21:26 crc kubenswrapper[4677]: E1007 13:21:26.640546 4677 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e4964a6-8002-49e2-bf7b-5aa664061eaa" containerName="registry-server" Oct 07 13:21:26 crc kubenswrapper[4677]: I1007 13:21:26.640554 4677 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e4964a6-8002-49e2-bf7b-5aa664061eaa" containerName="registry-server" Oct 07 13:21:26 crc kubenswrapper[4677]: E1007 13:21:26.640567 4677 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e4964a6-8002-49e2-bf7b-5aa664061eaa" containerName="extract-utilities" Oct 07 13:21:26 crc kubenswrapper[4677]: I1007 13:21:26.640575 4677 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e4964a6-8002-49e2-bf7b-5aa664061eaa" containerName="extract-utilities" Oct 07 13:21:26 crc kubenswrapper[4677]: I1007 13:21:26.640731 4677 memory_manager.go:354] "RemoveStaleState removing state" podUID="9e4964a6-8002-49e2-bf7b-5aa664061eaa" 
containerName="registry-server" Oct 07 13:21:26 crc kubenswrapper[4677]: I1007 13:21:26.641785 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/3d8ae2547c933a67aba0b0e56cba361860dfd1d9c4832c2b34acb572d778bsn" Oct 07 13:21:26 crc kubenswrapper[4677]: I1007 13:21:26.645217 4677 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-krhf2" Oct 07 13:21:26 crc kubenswrapper[4677]: I1007 13:21:26.648925 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/3d8ae2547c933a67aba0b0e56cba361860dfd1d9c4832c2b34acb572d778bsn"] Oct 07 13:21:26 crc kubenswrapper[4677]: I1007 13:21:26.780740 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f6b65f72-d122-49d5-83dd-06541b985a21-bundle\") pod \"3d8ae2547c933a67aba0b0e56cba361860dfd1d9c4832c2b34acb572d778bsn\" (UID: \"f6b65f72-d122-49d5-83dd-06541b985a21\") " pod="openstack-operators/3d8ae2547c933a67aba0b0e56cba361860dfd1d9c4832c2b34acb572d778bsn" Oct 07 13:21:26 crc kubenswrapper[4677]: I1007 13:21:26.780810 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-md6r8\" (UniqueName: \"kubernetes.io/projected/f6b65f72-d122-49d5-83dd-06541b985a21-kube-api-access-md6r8\") pod \"3d8ae2547c933a67aba0b0e56cba361860dfd1d9c4832c2b34acb572d778bsn\" (UID: \"f6b65f72-d122-49d5-83dd-06541b985a21\") " pod="openstack-operators/3d8ae2547c933a67aba0b0e56cba361860dfd1d9c4832c2b34acb572d778bsn" Oct 07 13:21:26 crc kubenswrapper[4677]: I1007 13:21:26.780859 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f6b65f72-d122-49d5-83dd-06541b985a21-util\") pod \"3d8ae2547c933a67aba0b0e56cba361860dfd1d9c4832c2b34acb572d778bsn\" (UID: \"f6b65f72-d122-49d5-83dd-06541b985a21\") " pod="openstack-operators/3d8ae2547c933a67aba0b0e56cba361860dfd1d9c4832c2b34acb572d778bsn" Oct 07 13:21:26 crc kubenswrapper[4677]: I1007 13:21:26.882026 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f6b65f72-d122-49d5-83dd-06541b985a21-bundle\") pod \"3d8ae2547c933a67aba0b0e56cba361860dfd1d9c4832c2b34acb572d778bsn\" (UID: \"f6b65f72-d122-49d5-83dd-06541b985a21\") " pod="openstack-operators/3d8ae2547c933a67aba0b0e56cba361860dfd1d9c4832c2b34acb572d778bsn" Oct 07 13:21:26 crc kubenswrapper[4677]: I1007 13:21:26.882096 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-md6r8\" (UniqueName: \"kubernetes.io/projected/f6b65f72-d122-49d5-83dd-06541b985a21-kube-api-access-md6r8\") pod \"3d8ae2547c933a67aba0b0e56cba361860dfd1d9c4832c2b34acb572d778bsn\" (UID: \"f6b65f72-d122-49d5-83dd-06541b985a21\") " pod="openstack-operators/3d8ae2547c933a67aba0b0e56cba361860dfd1d9c4832c2b34acb572d778bsn" Oct 07 13:21:26 crc kubenswrapper[4677]: I1007 13:21:26.882153 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f6b65f72-d122-49d5-83dd-06541b985a21-util\") pod \"3d8ae2547c933a67aba0b0e56cba361860dfd1d9c4832c2b34acb572d778bsn\" (UID: \"f6b65f72-d122-49d5-83dd-06541b985a21\") " pod="openstack-operators/3d8ae2547c933a67aba0b0e56cba361860dfd1d9c4832c2b34acb572d778bsn" Oct 07 13:21:26 crc kubenswrapper[4677]: I1007 13:21:26.882549 4677 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f6b65f72-d122-49d5-83dd-06541b985a21-bundle\") pod \"3d8ae2547c933a67aba0b0e56cba361860dfd1d9c4832c2b34acb572d778bsn\" (UID: \"f6b65f72-d122-49d5-83dd-06541b985a21\") " pod="openstack-operators/3d8ae2547c933a67aba0b0e56cba361860dfd1d9c4832c2b34acb572d778bsn" Oct 07 13:21:26 crc kubenswrapper[4677]: I1007 13:21:26.882634 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f6b65f72-d122-49d5-83dd-06541b985a21-util\") pod \"3d8ae2547c933a67aba0b0e56cba361860dfd1d9c4832c2b34acb572d778bsn\" (UID: \"f6b65f72-d122-49d5-83dd-06541b985a21\") " pod="openstack-operators/3d8ae2547c933a67aba0b0e56cba361860dfd1d9c4832c2b34acb572d778bsn" Oct 07 13:21:26 crc kubenswrapper[4677]: I1007 13:21:26.901249 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-md6r8\" (UniqueName: \"kubernetes.io/projected/f6b65f72-d122-49d5-83dd-06541b985a21-kube-api-access-md6r8\") pod \"3d8ae2547c933a67aba0b0e56cba361860dfd1d9c4832c2b34acb572d778bsn\" (UID: \"f6b65f72-d122-49d5-83dd-06541b985a21\") " pod="openstack-operators/3d8ae2547c933a67aba0b0e56cba361860dfd1d9c4832c2b34acb572d778bsn" Oct 07 13:21:27 crc kubenswrapper[4677]: I1007 13:21:27.012487 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/3d8ae2547c933a67aba0b0e56cba361860dfd1d9c4832c2b34acb572d778bsn" Oct 07 13:21:28 crc kubenswrapper[4677]: I1007 13:21:28.234975 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/3d8ae2547c933a67aba0b0e56cba361860dfd1d9c4832c2b34acb572d778bsn"] Oct 07 13:21:28 crc kubenswrapper[4677]: W1007 13:21:28.245676 4677 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf6b65f72_d122_49d5_83dd_06541b985a21.slice/crio-1b37ac57d8d2c1081f456ec5021346e092d7fc0f400a1383e2d91ea4aa1b65b0 WatchSource:0}: Error finding container 1b37ac57d8d2c1081f456ec5021346e092d7fc0f400a1383e2d91ea4aa1b65b0: Status 404 returned error can't find the container with id 1b37ac57d8d2c1081f456ec5021346e092d7fc0f400a1383e2d91ea4aa1b65b0 Oct 07 13:21:29 crc kubenswrapper[4677]: I1007 13:21:29.076040 4677 generic.go:334] "Generic (PLEG): container finished" podID="f6b65f72-d122-49d5-83dd-06541b985a21" containerID="c7f1f68e0581e41d01275ddb5a9281dd1cc824611e6f7d5fd460d639b880d032" exitCode=0 Oct 07 13:21:29 crc kubenswrapper[4677]: I1007 13:21:29.076360 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/3d8ae2547c933a67aba0b0e56cba361860dfd1d9c4832c2b34acb572d778bsn" event={"ID":"f6b65f72-d122-49d5-83dd-06541b985a21","Type":"ContainerDied","Data":"c7f1f68e0581e41d01275ddb5a9281dd1cc824611e6f7d5fd460d639b880d032"} Oct 07 13:21:29 crc kubenswrapper[4677]: I1007 13:21:29.077126 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/3d8ae2547c933a67aba0b0e56cba361860dfd1d9c4832c2b34acb572d778bsn" event={"ID":"f6b65f72-d122-49d5-83dd-06541b985a21","Type":"ContainerStarted","Data":"1b37ac57d8d2c1081f456ec5021346e092d7fc0f400a1383e2d91ea4aa1b65b0"} Oct 07 13:21:30 crc kubenswrapper[4677]: I1007 13:21:30.088653 4677 generic.go:334] "Generic (PLEG): container finished" podID="f6b65f72-d122-49d5-83dd-06541b985a21" containerID="e322569eb2b950c724b113ea499bc83fef690e7f5beb87e95037d3cfd0bdf326" exitCode=0 Oct 07 13:21:30 crc kubenswrapper[4677]: I1007 
13:21:30.088704 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/3d8ae2547c933a67aba0b0e56cba361860dfd1d9c4832c2b34acb572d778bsn" event={"ID":"f6b65f72-d122-49d5-83dd-06541b985a21","Type":"ContainerDied","Data":"e322569eb2b950c724b113ea499bc83fef690e7f5beb87e95037d3cfd0bdf326"} Oct 07 13:21:30 crc kubenswrapper[4677]: E1007 13:21:30.495659 4677 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf6b65f72_d122_49d5_83dd_06541b985a21.slice/crio-c190424511409e29680335d531107ec5129222100e8f8ca39e9d77a8db08acf2.scope\": RecentStats: unable to find data in memory cache]" Oct 07 13:21:31 crc kubenswrapper[4677]: I1007 13:21:31.100642 4677 generic.go:334] "Generic (PLEG): container finished" podID="f6b65f72-d122-49d5-83dd-06541b985a21" containerID="c190424511409e29680335d531107ec5129222100e8f8ca39e9d77a8db08acf2" exitCode=0 Oct 07 13:21:31 crc kubenswrapper[4677]: I1007 13:21:31.100708 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/3d8ae2547c933a67aba0b0e56cba361860dfd1d9c4832c2b34acb572d778bsn" event={"ID":"f6b65f72-d122-49d5-83dd-06541b985a21","Type":"ContainerDied","Data":"c190424511409e29680335d531107ec5129222100e8f8ca39e9d77a8db08acf2"} Oct 07 13:21:32 crc kubenswrapper[4677]: I1007 13:21:32.455589 4677 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/3d8ae2547c933a67aba0b0e56cba361860dfd1d9c4832c2b34acb572d778bsn" Oct 07 13:21:32 crc kubenswrapper[4677]: I1007 13:21:32.555266 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f6b65f72-d122-49d5-83dd-06541b985a21-bundle\") pod \"f6b65f72-d122-49d5-83dd-06541b985a21\" (UID: \"f6b65f72-d122-49d5-83dd-06541b985a21\") " Oct 07 13:21:32 crc kubenswrapper[4677]: I1007 13:21:32.555315 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f6b65f72-d122-49d5-83dd-06541b985a21-util\") pod \"f6b65f72-d122-49d5-83dd-06541b985a21\" (UID: \"f6b65f72-d122-49d5-83dd-06541b985a21\") " Oct 07 13:21:32 crc kubenswrapper[4677]: I1007 13:21:32.555374 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-md6r8\" (UniqueName: \"kubernetes.io/projected/f6b65f72-d122-49d5-83dd-06541b985a21-kube-api-access-md6r8\") pod \"f6b65f72-d122-49d5-83dd-06541b985a21\" (UID: \"f6b65f72-d122-49d5-83dd-06541b985a21\") " Oct 07 13:21:32 crc kubenswrapper[4677]: I1007 13:21:32.556928 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f6b65f72-d122-49d5-83dd-06541b985a21-bundle" (OuterVolumeSpecName: "bundle") pod "f6b65f72-d122-49d5-83dd-06541b985a21" (UID: "f6b65f72-d122-49d5-83dd-06541b985a21"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:21:32 crc kubenswrapper[4677]: I1007 13:21:32.579618 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f6b65f72-d122-49d5-83dd-06541b985a21-util" (OuterVolumeSpecName: "util") pod "f6b65f72-d122-49d5-83dd-06541b985a21" (UID: "f6b65f72-d122-49d5-83dd-06541b985a21"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:21:32 crc kubenswrapper[4677]: I1007 13:21:32.587849 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6b65f72-d122-49d5-83dd-06541b985a21-kube-api-access-md6r8" (OuterVolumeSpecName: "kube-api-access-md6r8") pod "f6b65f72-d122-49d5-83dd-06541b985a21" (UID: "f6b65f72-d122-49d5-83dd-06541b985a21"). InnerVolumeSpecName "kube-api-access-md6r8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:21:32 crc kubenswrapper[4677]: I1007 13:21:32.591980 4677 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-7js5b"] Oct 07 13:21:32 crc kubenswrapper[4677]: I1007 13:21:32.592837 4677 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-7js5b" podUID="687f0d4d-64c0-4d3f-b50f-899993b96168" containerName="registry-server" containerID="cri-o://73e3a5b48ba5cd3620ee6f1638e1f1707327e6c1e7dfc41e68e7ee947f2f2789" gracePeriod=2 Oct 07 13:21:32 crc kubenswrapper[4677]: I1007 13:21:32.656794 4677 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f6b65f72-d122-49d5-83dd-06541b985a21-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 13:21:32 crc kubenswrapper[4677]: I1007 13:21:32.657053 4677 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f6b65f72-d122-49d5-83dd-06541b985a21-util\") on node \"crc\" DevicePath \"\"" Oct 07 13:21:32 crc kubenswrapper[4677]: I1007 13:21:32.657067 4677 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-md6r8\" (UniqueName: \"kubernetes.io/projected/f6b65f72-d122-49d5-83dd-06541b985a21-kube-api-access-md6r8\") on node \"crc\" DevicePath \"\"" Oct 07 13:21:32 crc kubenswrapper[4677]: I1007 13:21:32.981166 4677 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7js5b" Oct 07 13:21:33 crc kubenswrapper[4677]: I1007 13:21:33.062632 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t548t\" (UniqueName: \"kubernetes.io/projected/687f0d4d-64c0-4d3f-b50f-899993b96168-kube-api-access-t548t\") pod \"687f0d4d-64c0-4d3f-b50f-899993b96168\" (UID: \"687f0d4d-64c0-4d3f-b50f-899993b96168\") " Oct 07 13:21:33 crc kubenswrapper[4677]: I1007 13:21:33.062694 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/687f0d4d-64c0-4d3f-b50f-899993b96168-catalog-content\") pod \"687f0d4d-64c0-4d3f-b50f-899993b96168\" (UID: \"687f0d4d-64c0-4d3f-b50f-899993b96168\") " Oct 07 13:21:33 crc kubenswrapper[4677]: I1007 13:21:33.062814 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/687f0d4d-64c0-4d3f-b50f-899993b96168-utilities\") pod \"687f0d4d-64c0-4d3f-b50f-899993b96168\" (UID: \"687f0d4d-64c0-4d3f-b50f-899993b96168\") " Oct 07 13:21:33 crc kubenswrapper[4677]: I1007 13:21:33.063594 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/687f0d4d-64c0-4d3f-b50f-899993b96168-utilities" (OuterVolumeSpecName: "utilities") pod "687f0d4d-64c0-4d3f-b50f-899993b96168" (UID: "687f0d4d-64c0-4d3f-b50f-899993b96168"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:21:33 crc kubenswrapper[4677]: I1007 13:21:33.066152 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/687f0d4d-64c0-4d3f-b50f-899993b96168-kube-api-access-t548t" (OuterVolumeSpecName: "kube-api-access-t548t") pod "687f0d4d-64c0-4d3f-b50f-899993b96168" (UID: "687f0d4d-64c0-4d3f-b50f-899993b96168"). InnerVolumeSpecName "kube-api-access-t548t". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:21:33 crc kubenswrapper[4677]: I1007 13:21:33.116991 4677 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/3d8ae2547c933a67aba0b0e56cba361860dfd1d9c4832c2b34acb572d778bsn" Oct 07 13:21:33 crc kubenswrapper[4677]: I1007 13:21:33.117004 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/3d8ae2547c933a67aba0b0e56cba361860dfd1d9c4832c2b34acb572d778bsn" event={"ID":"f6b65f72-d122-49d5-83dd-06541b985a21","Type":"ContainerDied","Data":"1b37ac57d8d2c1081f456ec5021346e092d7fc0f400a1383e2d91ea4aa1b65b0"} Oct 07 13:21:33 crc kubenswrapper[4677]: I1007 13:21:33.117153 4677 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1b37ac57d8d2c1081f456ec5021346e092d7fc0f400a1383e2d91ea4aa1b65b0" Oct 07 13:21:33 crc kubenswrapper[4677]: I1007 13:21:33.119285 4677 generic.go:334] "Generic (PLEG): container finished" podID="687f0d4d-64c0-4d3f-b50f-899993b96168" containerID="73e3a5b48ba5cd3620ee6f1638e1f1707327e6c1e7dfc41e68e7ee947f2f2789" exitCode=0 Oct 07 13:21:33 crc kubenswrapper[4677]: I1007 13:21:33.119344 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7js5b" event={"ID":"687f0d4d-64c0-4d3f-b50f-899993b96168","Type":"ContainerDied","Data":"73e3a5b48ba5cd3620ee6f1638e1f1707327e6c1e7dfc41e68e7ee947f2f2789"} Oct 07 13:21:33 crc kubenswrapper[4677]: I1007 13:21:33.119376 4677 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7js5b" Oct 07 13:21:33 crc kubenswrapper[4677]: I1007 13:21:33.119408 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7js5b" event={"ID":"687f0d4d-64c0-4d3f-b50f-899993b96168","Type":"ContainerDied","Data":"b17658690959e740afd4664a89262845d9cedf118a2e6f28e6b5e532bb42c7d4"} Oct 07 13:21:33 crc kubenswrapper[4677]: I1007 13:21:33.119451 4677 scope.go:117] "RemoveContainer" containerID="73e3a5b48ba5cd3620ee6f1638e1f1707327e6c1e7dfc41e68e7ee947f2f2789" Oct 07 13:21:33 crc kubenswrapper[4677]: I1007 13:21:33.138700 4677 scope.go:117] "RemoveContainer" containerID="00dc68dfb06cc4033ff6fed01a5ba6a0674847444844e412183f5c79426616ed" Oct 07 13:21:33 crc kubenswrapper[4677]: I1007 13:21:33.160618 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/687f0d4d-64c0-4d3f-b50f-899993b96168-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "687f0d4d-64c0-4d3f-b50f-899993b96168" (UID: "687f0d4d-64c0-4d3f-b50f-899993b96168"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:21:33 crc kubenswrapper[4677]: I1007 13:21:33.160386 4677 scope.go:117] "RemoveContainer" containerID="5391666f91cd8e7bd8d309e61f59813415d7510b34f5a17b52a1495b582fdbdd" Oct 07 13:21:33 crc kubenswrapper[4677]: I1007 13:21:33.164715 4677 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/687f0d4d-64c0-4d3f-b50f-899993b96168-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 13:21:33 crc kubenswrapper[4677]: I1007 13:21:33.164738 4677 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t548t\" (UniqueName: \"kubernetes.io/projected/687f0d4d-64c0-4d3f-b50f-899993b96168-kube-api-access-t548t\") on node \"crc\" DevicePath \"\"" Oct 07 13:21:33 crc kubenswrapper[4677]: I1007 13:21:33.164749 4677 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/687f0d4d-64c0-4d3f-b50f-899993b96168-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 13:21:33 crc kubenswrapper[4677]: I1007 13:21:33.178454 4677 scope.go:117] "RemoveContainer" containerID="73e3a5b48ba5cd3620ee6f1638e1f1707327e6c1e7dfc41e68e7ee947f2f2789" Oct 07 13:21:33 crc kubenswrapper[4677]: E1007 13:21:33.179021 4677 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"73e3a5b48ba5cd3620ee6f1638e1f1707327e6c1e7dfc41e68e7ee947f2f2789\": container with ID starting with 73e3a5b48ba5cd3620ee6f1638e1f1707327e6c1e7dfc41e68e7ee947f2f2789 not found: ID does not exist" containerID="73e3a5b48ba5cd3620ee6f1638e1f1707327e6c1e7dfc41e68e7ee947f2f2789" Oct 07 13:21:33 crc kubenswrapper[4677]: I1007 13:21:33.179068 4677 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"73e3a5b48ba5cd3620ee6f1638e1f1707327e6c1e7dfc41e68e7ee947f2f2789"} err="failed to get container status \"73e3a5b48ba5cd3620ee6f1638e1f1707327e6c1e7dfc41e68e7ee947f2f2789\": rpc error: code = NotFound desc = could not find container \"73e3a5b48ba5cd3620ee6f1638e1f1707327e6c1e7dfc41e68e7ee947f2f2789\": container with ID starting with 73e3a5b48ba5cd3620ee6f1638e1f1707327e6c1e7dfc41e68e7ee947f2f2789 not found: ID does not exist" Oct 07 13:21:33 crc kubenswrapper[4677]: I1007 13:21:33.179096 4677 scope.go:117] "RemoveContainer" containerID="00dc68dfb06cc4033ff6fed01a5ba6a0674847444844e412183f5c79426616ed" Oct 07 13:21:33 crc kubenswrapper[4677]: E1007 13:21:33.179524 4677 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"00dc68dfb06cc4033ff6fed01a5ba6a0674847444844e412183f5c79426616ed\": container with ID starting with 00dc68dfb06cc4033ff6fed01a5ba6a0674847444844e412183f5c79426616ed not found: ID does not exist" containerID="00dc68dfb06cc4033ff6fed01a5ba6a0674847444844e412183f5c79426616ed" Oct 07 13:21:33 crc kubenswrapper[4677]: I1007 13:21:33.179547 4677 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"00dc68dfb06cc4033ff6fed01a5ba6a0674847444844e412183f5c79426616ed"} err="failed to get container status \"00dc68dfb06cc4033ff6fed01a5ba6a0674847444844e412183f5c79426616ed\": rpc error: code = NotFound desc = could not find container \"00dc68dfb06cc4033ff6fed01a5ba6a0674847444844e412183f5c79426616ed\": container with ID starting with 00dc68dfb06cc4033ff6fed01a5ba6a0674847444844e412183f5c79426616ed not found: ID does not exist" Oct 07 13:21:33 crc 
kubenswrapper[4677]: I1007 13:21:33.179561 4677 scope.go:117] "RemoveContainer" containerID="5391666f91cd8e7bd8d309e61f59813415d7510b34f5a17b52a1495b582fdbdd" Oct 07 13:21:33 crc kubenswrapper[4677]: E1007 13:21:33.179817 4677 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5391666f91cd8e7bd8d309e61f59813415d7510b34f5a17b52a1495b582fdbdd\": container with ID starting with 5391666f91cd8e7bd8d309e61f59813415d7510b34f5a17b52a1495b582fdbdd not found: ID does not exist" containerID="5391666f91cd8e7bd8d309e61f59813415d7510b34f5a17b52a1495b582fdbdd" Oct 07 13:21:33 crc kubenswrapper[4677]: I1007 13:21:33.179856 4677 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5391666f91cd8e7bd8d309e61f59813415d7510b34f5a17b52a1495b582fdbdd"} err="failed to get container status \"5391666f91cd8e7bd8d309e61f59813415d7510b34f5a17b52a1495b582fdbdd\": rpc error: code = NotFound desc = could not find container \"5391666f91cd8e7bd8d309e61f59813415d7510b34f5a17b52a1495b582fdbdd\": container with ID starting with 5391666f91cd8e7bd8d309e61f59813415d7510b34f5a17b52a1495b582fdbdd not found: ID does not exist" Oct 07 13:21:33 crc kubenswrapper[4677]: I1007 13:21:33.435824 4677 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-7js5b"] Oct 07 13:21:33 crc kubenswrapper[4677]: I1007 13:21:33.442015 4677 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-7js5b"] Oct 07 13:21:35 crc kubenswrapper[4677]: I1007 13:21:35.317219 4677 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="687f0d4d-64c0-4d3f-b50f-899993b96168" path="/var/lib/kubelet/pods/687f0d4d-64c0-4d3f-b50f-899993b96168/volumes" Oct 07 13:21:37 crc kubenswrapper[4677]: I1007 13:21:37.395355 4677 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-9k2v9"] Oct 07 13:21:37 crc kubenswrapper[4677]: E1007 13:21:37.396108 4677 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6b65f72-d122-49d5-83dd-06541b985a21" containerName="util" Oct 07 13:21:37 crc kubenswrapper[4677]: I1007 13:21:37.396130 4677 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6b65f72-d122-49d5-83dd-06541b985a21" containerName="util" Oct 07 13:21:37 crc kubenswrapper[4677]: E1007 13:21:37.396152 4677 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6b65f72-d122-49d5-83dd-06541b985a21" containerName="pull" Oct 07 13:21:37 crc kubenswrapper[4677]: I1007 13:21:37.396164 4677 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6b65f72-d122-49d5-83dd-06541b985a21" containerName="pull" Oct 07 13:21:37 crc kubenswrapper[4677]: E1007 13:21:37.396184 4677 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="687f0d4d-64c0-4d3f-b50f-899993b96168" containerName="extract-utilities" Oct 07 13:21:37 crc kubenswrapper[4677]: I1007 13:21:37.396197 4677 state_mem.go:107] "Deleted CPUSet assignment" podUID="687f0d4d-64c0-4d3f-b50f-899993b96168" containerName="extract-utilities" Oct 07 13:21:37 crc kubenswrapper[4677]: E1007 13:21:37.396213 4677 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6b65f72-d122-49d5-83dd-06541b985a21" containerName="extract" Oct 07 13:21:37 crc kubenswrapper[4677]: I1007 13:21:37.396225 4677 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6b65f72-d122-49d5-83dd-06541b985a21" containerName="extract" Oct 07 13:21:37 crc kubenswrapper[4677]: E1007 
13:21:37.396254 4677 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="687f0d4d-64c0-4d3f-b50f-899993b96168" containerName="extract-content" Oct 07 13:21:37 crc kubenswrapper[4677]: I1007 13:21:37.396265 4677 state_mem.go:107] "Deleted CPUSet assignment" podUID="687f0d4d-64c0-4d3f-b50f-899993b96168" containerName="extract-content" Oct 07 13:21:37 crc kubenswrapper[4677]: E1007 13:21:37.396291 4677 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="687f0d4d-64c0-4d3f-b50f-899993b96168" containerName="registry-server" Oct 07 13:21:37 crc kubenswrapper[4677]: I1007 13:21:37.396303 4677 state_mem.go:107] "Deleted CPUSet assignment" podUID="687f0d4d-64c0-4d3f-b50f-899993b96168" containerName="registry-server" Oct 07 13:21:37 crc kubenswrapper[4677]: I1007 13:21:37.396579 4677 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6b65f72-d122-49d5-83dd-06541b985a21" containerName="extract" Oct 07 13:21:37 crc kubenswrapper[4677]: I1007 13:21:37.396622 4677 memory_manager.go:354] "RemoveStaleState removing state" podUID="687f0d4d-64c0-4d3f-b50f-899993b96168" containerName="registry-server" Oct 07 13:21:37 crc kubenswrapper[4677]: I1007 13:21:37.398544 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9k2v9" Oct 07 13:21:37 crc kubenswrapper[4677]: I1007 13:21:37.410590 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9k2v9"] Oct 07 13:21:37 crc kubenswrapper[4677]: I1007 13:21:37.524587 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65080209-d321-4dee-b020-d47f33f7c408-utilities\") pod \"redhat-marketplace-9k2v9\" (UID: \"65080209-d321-4dee-b020-d47f33f7c408\") " pod="openshift-marketplace/redhat-marketplace-9k2v9" Oct 07 13:21:37 crc kubenswrapper[4677]: I1007 13:21:37.524664 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gth77\" (UniqueName: \"kubernetes.io/projected/65080209-d321-4dee-b020-d47f33f7c408-kube-api-access-gth77\") pod \"redhat-marketplace-9k2v9\" (UID: \"65080209-d321-4dee-b020-d47f33f7c408\") " pod="openshift-marketplace/redhat-marketplace-9k2v9" Oct 07 13:21:37 crc kubenswrapper[4677]: I1007 13:21:37.524703 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65080209-d321-4dee-b020-d47f33f7c408-catalog-content\") pod \"redhat-marketplace-9k2v9\" (UID: \"65080209-d321-4dee-b020-d47f33f7c408\") " pod="openshift-marketplace/redhat-marketplace-9k2v9" Oct 07 13:21:37 crc kubenswrapper[4677]: I1007 13:21:37.626210 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65080209-d321-4dee-b020-d47f33f7c408-utilities\") pod \"redhat-marketplace-9k2v9\" (UID: \"65080209-d321-4dee-b020-d47f33f7c408\") " pod="openshift-marketplace/redhat-marketplace-9k2v9" Oct 07 13:21:37 crc kubenswrapper[4677]: I1007 13:21:37.626286 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gth77\" (UniqueName: \"kubernetes.io/projected/65080209-d321-4dee-b020-d47f33f7c408-kube-api-access-gth77\") pod \"redhat-marketplace-9k2v9\" (UID: \"65080209-d321-4dee-b020-d47f33f7c408\") " pod="openshift-marketplace/redhat-marketplace-9k2v9" Oct 07 
13:21:37 crc kubenswrapper[4677]: I1007 13:21:37.626329 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65080209-d321-4dee-b020-d47f33f7c408-catalog-content\") pod \"redhat-marketplace-9k2v9\" (UID: \"65080209-d321-4dee-b020-d47f33f7c408\") " pod="openshift-marketplace/redhat-marketplace-9k2v9" Oct 07 13:21:37 crc kubenswrapper[4677]: I1007 13:21:37.626797 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65080209-d321-4dee-b020-d47f33f7c408-catalog-content\") pod \"redhat-marketplace-9k2v9\" (UID: \"65080209-d321-4dee-b020-d47f33f7c408\") " pod="openshift-marketplace/redhat-marketplace-9k2v9" Oct 07 13:21:37 crc kubenswrapper[4677]: I1007 13:21:37.626871 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65080209-d321-4dee-b020-d47f33f7c408-utilities\") pod \"redhat-marketplace-9k2v9\" (UID: \"65080209-d321-4dee-b020-d47f33f7c408\") " pod="openshift-marketplace/redhat-marketplace-9k2v9" Oct 07 13:21:37 crc kubenswrapper[4677]: I1007 13:21:37.653815 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gth77\" (UniqueName: \"kubernetes.io/projected/65080209-d321-4dee-b020-d47f33f7c408-kube-api-access-gth77\") pod \"redhat-marketplace-9k2v9\" (UID: \"65080209-d321-4dee-b020-d47f33f7c408\") " pod="openshift-marketplace/redhat-marketplace-9k2v9" Oct 07 13:21:37 crc kubenswrapper[4677]: I1007 13:21:37.737934 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9k2v9" Oct 07 13:21:38 crc kubenswrapper[4677]: I1007 13:21:38.252701 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9k2v9"] Oct 07 13:21:38 crc kubenswrapper[4677]: W1007 13:21:38.272242 4677 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod65080209_d321_4dee_b020_d47f33f7c408.slice/crio-34d8ce52f9755085507540e999256de1fa6b2051774df9f08c145ad873f78769 WatchSource:0}: Error finding container 34d8ce52f9755085507540e999256de1fa6b2051774df9f08c145ad873f78769: Status 404 returned error can't find the container with id 34d8ce52f9755085507540e999256de1fa6b2051774df9f08c145ad873f78769 Oct 07 13:21:39 crc kubenswrapper[4677]: I1007 13:21:39.170306 4677 generic.go:334] "Generic (PLEG): container finished" podID="65080209-d321-4dee-b020-d47f33f7c408" containerID="d6e18ec07b05ff9e2ad763d8a9c7c24e6339c97aa4d2550dd756f2e046e92ccf" exitCode=0 Oct 07 13:21:39 crc kubenswrapper[4677]: I1007 13:21:39.170394 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9k2v9" event={"ID":"65080209-d321-4dee-b020-d47f33f7c408","Type":"ContainerDied","Data":"d6e18ec07b05ff9e2ad763d8a9c7c24e6339c97aa4d2550dd756f2e046e92ccf"} Oct 07 13:21:39 crc kubenswrapper[4677]: I1007 13:21:39.170708 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9k2v9" event={"ID":"65080209-d321-4dee-b020-d47f33f7c408","Type":"ContainerStarted","Data":"34d8ce52f9755085507540e999256de1fa6b2051774df9f08c145ad873f78769"} Oct 07 13:21:40 crc kubenswrapper[4677]: I1007 13:21:40.195245 4677 generic.go:334] "Generic (PLEG): container finished" podID="65080209-d321-4dee-b020-d47f33f7c408" 
containerID="8b5b6c3d8d62ffa997ba4829c1b6b2d9692ac48fdf04050828e973cb724ee24f" exitCode=0 Oct 07 13:21:40 crc kubenswrapper[4677]: I1007 13:21:40.195543 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9k2v9" event={"ID":"65080209-d321-4dee-b020-d47f33f7c408","Type":"ContainerDied","Data":"8b5b6c3d8d62ffa997ba4829c1b6b2d9692ac48fdf04050828e973cb724ee24f"} Oct 07 13:21:41 crc kubenswrapper[4677]: I1007 13:21:41.206300 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9k2v9" event={"ID":"65080209-d321-4dee-b020-d47f33f7c408","Type":"ContainerStarted","Data":"8e02c35f035965c57c7d5156ebfd2259102b1b799b3e1f57d3d9b180ab6543dd"} Oct 07 13:21:41 crc kubenswrapper[4677]: I1007 13:21:41.225649 4677 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-9k2v9" podStartSLOduration=2.80438161 podStartE2EDuration="4.225620764s" podCreationTimestamp="2025-10-07 13:21:37 +0000 UTC" firstStartedPulling="2025-10-07 13:21:39.1728139 +0000 UTC m=+870.658523015" lastFinishedPulling="2025-10-07 13:21:40.594053044 +0000 UTC m=+872.079762169" observedRunningTime="2025-10-07 13:21:41.225112349 +0000 UTC m=+872.710821505" watchObservedRunningTime="2025-10-07 13:21:41.225620764 +0000 UTC m=+872.711329879" Oct 07 13:21:42 crc kubenswrapper[4677]: I1007 13:21:42.402111 4677 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-54986549bb-c7n9b"] Oct 07 13:21:42 crc kubenswrapper[4677]: I1007 13:21:42.403200 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-54986549bb-c7n9b" Oct 07 13:21:42 crc kubenswrapper[4677]: I1007 13:21:42.405133 4677 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-service-cert" Oct 07 13:21:42 crc kubenswrapper[4677]: I1007 13:21:42.405534 4677 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-qd8w7" Oct 07 13:21:42 crc kubenswrapper[4677]: I1007 13:21:42.417888 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-54986549bb-c7n9b"] Oct 07 13:21:42 crc kubenswrapper[4677]: I1007 13:21:42.499417 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fzwbp\" (UniqueName: \"kubernetes.io/projected/5f1035cf-bd50-4425-a342-92a71ab7f16e-kube-api-access-fzwbp\") pod \"keystone-operator-controller-manager-54986549bb-c7n9b\" (UID: \"5f1035cf-bd50-4425-a342-92a71ab7f16e\") " pod="openstack-operators/keystone-operator-controller-manager-54986549bb-c7n9b" Oct 07 13:21:42 crc kubenswrapper[4677]: I1007 13:21:42.499570 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/5f1035cf-bd50-4425-a342-92a71ab7f16e-webhook-cert\") pod \"keystone-operator-controller-manager-54986549bb-c7n9b\" (UID: \"5f1035cf-bd50-4425-a342-92a71ab7f16e\") " pod="openstack-operators/keystone-operator-controller-manager-54986549bb-c7n9b" Oct 07 13:21:42 crc kubenswrapper[4677]: I1007 13:21:42.499651 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: 
\"kubernetes.io/secret/5f1035cf-bd50-4425-a342-92a71ab7f16e-apiservice-cert\") pod \"keystone-operator-controller-manager-54986549bb-c7n9b\" (UID: \"5f1035cf-bd50-4425-a342-92a71ab7f16e\") " pod="openstack-operators/keystone-operator-controller-manager-54986549bb-c7n9b" Oct 07 13:21:42 crc kubenswrapper[4677]: I1007 13:21:42.601037 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/5f1035cf-bd50-4425-a342-92a71ab7f16e-apiservice-cert\") pod \"keystone-operator-controller-manager-54986549bb-c7n9b\" (UID: \"5f1035cf-bd50-4425-a342-92a71ab7f16e\") " pod="openstack-operators/keystone-operator-controller-manager-54986549bb-c7n9b" Oct 07 13:21:42 crc kubenswrapper[4677]: I1007 13:21:42.601077 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fzwbp\" (UniqueName: \"kubernetes.io/projected/5f1035cf-bd50-4425-a342-92a71ab7f16e-kube-api-access-fzwbp\") pod \"keystone-operator-controller-manager-54986549bb-c7n9b\" (UID: \"5f1035cf-bd50-4425-a342-92a71ab7f16e\") " pod="openstack-operators/keystone-operator-controller-manager-54986549bb-c7n9b" Oct 07 13:21:42 crc kubenswrapper[4677]: I1007 13:21:42.601132 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/5f1035cf-bd50-4425-a342-92a71ab7f16e-webhook-cert\") pod \"keystone-operator-controller-manager-54986549bb-c7n9b\" (UID: \"5f1035cf-bd50-4425-a342-92a71ab7f16e\") " pod="openstack-operators/keystone-operator-controller-manager-54986549bb-c7n9b" Oct 07 13:21:42 crc kubenswrapper[4677]: I1007 13:21:42.607152 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/5f1035cf-bd50-4425-a342-92a71ab7f16e-apiservice-cert\") pod \"keystone-operator-controller-manager-54986549bb-c7n9b\" (UID: \"5f1035cf-bd50-4425-a342-92a71ab7f16e\") " pod="openstack-operators/keystone-operator-controller-manager-54986549bb-c7n9b" Oct 07 13:21:42 crc kubenswrapper[4677]: I1007 13:21:42.614075 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/5f1035cf-bd50-4425-a342-92a71ab7f16e-webhook-cert\") pod \"keystone-operator-controller-manager-54986549bb-c7n9b\" (UID: \"5f1035cf-bd50-4425-a342-92a71ab7f16e\") " pod="openstack-operators/keystone-operator-controller-manager-54986549bb-c7n9b" Oct 07 13:21:42 crc kubenswrapper[4677]: I1007 13:21:42.618736 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fzwbp\" (UniqueName: \"kubernetes.io/projected/5f1035cf-bd50-4425-a342-92a71ab7f16e-kube-api-access-fzwbp\") pod \"keystone-operator-controller-manager-54986549bb-c7n9b\" (UID: \"5f1035cf-bd50-4425-a342-92a71ab7f16e\") " pod="openstack-operators/keystone-operator-controller-manager-54986549bb-c7n9b" Oct 07 13:21:42 crc kubenswrapper[4677]: I1007 13:21:42.721456 4677 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-54986549bb-c7n9b" Oct 07 13:21:43 crc kubenswrapper[4677]: I1007 13:21:43.211707 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-54986549bb-c7n9b"] Oct 07 13:21:44 crc kubenswrapper[4677]: I1007 13:21:44.237851 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-54986549bb-c7n9b" event={"ID":"5f1035cf-bd50-4425-a342-92a71ab7f16e","Type":"ContainerStarted","Data":"bc3bb6d890a0a2f2a5b6b0ced856021f9911ef39c61f806ebc9c04bfd0024754"} Oct 07 13:21:45 crc kubenswrapper[4677]: I1007 13:21:45.245122 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-54986549bb-c7n9b" event={"ID":"5f1035cf-bd50-4425-a342-92a71ab7f16e","Type":"ContainerStarted","Data":"0116fd6016f6ae8f8544a6f2bb9eaa3cd0cf9e7ecb9b56eaebdb22682d49d205"} Oct 07 13:21:45 crc kubenswrapper[4677]: I1007 13:21:45.245370 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-54986549bb-c7n9b" event={"ID":"5f1035cf-bd50-4425-a342-92a71ab7f16e","Type":"ContainerStarted","Data":"3c14f8b5cf2ad85d869cde7f7f826e8c464c4d90ede3f8867bd428890a3b0079"} Oct 07 13:21:45 crc kubenswrapper[4677]: I1007 13:21:45.245386 4677 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-54986549bb-c7n9b" Oct 07 13:21:45 crc kubenswrapper[4677]: I1007 13:21:45.267971 4677 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-54986549bb-c7n9b" podStartSLOduration=1.4374592480000001 podStartE2EDuration="3.267942943s" podCreationTimestamp="2025-10-07 13:21:42 +0000 UTC" firstStartedPulling="2025-10-07 13:21:43.220714689 +0000 UTC m=+874.706423814" lastFinishedPulling="2025-10-07 13:21:45.051198364 +0000 UTC m=+876.536907509" observedRunningTime="2025-10-07 13:21:45.260780877 +0000 UTC m=+876.746490002" watchObservedRunningTime="2025-10-07 13:21:45.267942943 +0000 UTC m=+876.753652078" Oct 07 13:21:47 crc kubenswrapper[4677]: I1007 13:21:47.738576 4677 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-9k2v9" Oct 07 13:21:47 crc kubenswrapper[4677]: I1007 13:21:47.738921 4677 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-9k2v9" Oct 07 13:21:47 crc kubenswrapper[4677]: I1007 13:21:47.815818 4677 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-9k2v9" Oct 07 13:21:48 crc kubenswrapper[4677]: I1007 13:21:48.340141 4677 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-9k2v9" Oct 07 13:21:49 crc kubenswrapper[4677]: I1007 13:21:49.274377 4677 generic.go:334] "Generic (PLEG): container finished" podID="c50e7113-f37a-4ea1-9d53-c53106564a48" containerID="4bd0bb354d34a9f6279b5aa80df037c139c91e480664fe124c3295f33023ff11" exitCode=0 Oct 07 13:21:49 crc kubenswrapper[4677]: I1007 13:21:49.274502 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/rabbitmq-server-0" event={"ID":"c50e7113-f37a-4ea1-9d53-c53106564a48","Type":"ContainerDied","Data":"4bd0bb354d34a9f6279b5aa80df037c139c91e480664fe124c3295f33023ff11"} 
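[Editor's aside, not part of the journal: the pod_startup_latency_tracker entries above report podStartSLOduration next to podStartE2EDuration and the firstStartedPulling/lastFinishedPulling timestamps. The SLO figure is simply the end-to-end startup time with the image-pull window subtracted, which can be checked directly against the logged values. The sketch below recomputes it for the keystone-operator-index-dptwb entry using the monotonic "m=+..." offsets copied from that log line; the helper name slo_duration is illustrative, not anything from the kubelet source.]

# Sketch: recompute podStartSLOduration from values logged by
# pod_startup_latency_tracker.go (keystone-operator-index-dptwb entry above).

def slo_duration(e2e_seconds: float,
                 first_started_pulling_m: float,
                 last_finished_pulling_m: float) -> float:
    """End-to-end pod startup time minus the image-pull window."""
    pull_window = last_finished_pulling_m - first_started_pulling_m
    return e2e_seconds - pull_window

# podStartE2EDuration=7.013164185s, firstStartedPulling m=+847.575882277,
# lastFinishedPulling m=+849.029784469 (copied from the log entry).
print(slo_duration(7.013164185, 847.575882277, 849.029784469))
# prints ~5.559261993 (up to float rounding), matching podStartSLOduration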
Oct 07 13:21:50 crc kubenswrapper[4677]: I1007 13:21:50.287073 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/rabbitmq-server-0" event={"ID":"c50e7113-f37a-4ea1-9d53-c53106564a48","Type":"ContainerStarted","Data":"f59b7e3688a9b8b62920e8dfb9ee849dfa652fd717a87b9d6e0b5218753bf5bf"} Oct 07 13:21:50 crc kubenswrapper[4677]: I1007 13:21:50.287727 4677 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="keystone-kuttl-tests/rabbitmq-server-0" Oct 07 13:21:50 crc kubenswrapper[4677]: I1007 13:21:50.321342 4677 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keystone-kuttl-tests/rabbitmq-server-0" podStartSLOduration=36.126311259 podStartE2EDuration="42.321316356s" podCreationTimestamp="2025-10-07 13:21:08 +0000 UTC" firstStartedPulling="2025-10-07 13:21:09.96354456 +0000 UTC m=+841.449253715" lastFinishedPulling="2025-10-07 13:21:16.158549697 +0000 UTC m=+847.644258812" observedRunningTime="2025-10-07 13:21:50.311265707 +0000 UTC m=+881.796974912" watchObservedRunningTime="2025-10-07 13:21:50.321316356 +0000 UTC m=+881.807025501" Oct 07 13:21:52 crc kubenswrapper[4677]: I1007 13:21:52.728005 4677 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-54986549bb-c7n9b" Oct 07 13:21:52 crc kubenswrapper[4677]: I1007 13:21:52.790838 4677 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-9k2v9"] Oct 07 13:21:52 crc kubenswrapper[4677]: I1007 13:21:52.791156 4677 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-9k2v9" podUID="65080209-d321-4dee-b020-d47f33f7c408" containerName="registry-server" containerID="cri-o://8e02c35f035965c57c7d5156ebfd2259102b1b799b3e1f57d3d9b180ab6543dd" gracePeriod=2 Oct 07 13:21:53 crc kubenswrapper[4677]: I1007 13:21:53.216854 4677 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9k2v9" Oct 07 13:21:53 crc kubenswrapper[4677]: I1007 13:21:53.314133 4677 generic.go:334] "Generic (PLEG): container finished" podID="65080209-d321-4dee-b020-d47f33f7c408" containerID="8e02c35f035965c57c7d5156ebfd2259102b1b799b3e1f57d3d9b180ab6543dd" exitCode=0 Oct 07 13:21:53 crc kubenswrapper[4677]: I1007 13:21:53.314237 4677 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9k2v9" Oct 07 13:21:53 crc kubenswrapper[4677]: I1007 13:21:53.314947 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9k2v9" event={"ID":"65080209-d321-4dee-b020-d47f33f7c408","Type":"ContainerDied","Data":"8e02c35f035965c57c7d5156ebfd2259102b1b799b3e1f57d3d9b180ab6543dd"} Oct 07 13:21:53 crc kubenswrapper[4677]: I1007 13:21:53.314990 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9k2v9" event={"ID":"65080209-d321-4dee-b020-d47f33f7c408","Type":"ContainerDied","Data":"34d8ce52f9755085507540e999256de1fa6b2051774df9f08c145ad873f78769"} Oct 07 13:21:53 crc kubenswrapper[4677]: I1007 13:21:53.315012 4677 scope.go:117] "RemoveContainer" containerID="8e02c35f035965c57c7d5156ebfd2259102b1b799b3e1f57d3d9b180ab6543dd" Oct 07 13:21:53 crc kubenswrapper[4677]: I1007 13:21:53.329665 4677 scope.go:117] "RemoveContainer" containerID="8b5b6c3d8d62ffa997ba4829c1b6b2d9692ac48fdf04050828e973cb724ee24f" Oct 07 13:21:53 crc kubenswrapper[4677]: I1007 13:21:53.348859 4677 scope.go:117] "RemoveContainer" containerID="d6e18ec07b05ff9e2ad763d8a9c7c24e6339c97aa4d2550dd756f2e046e92ccf" Oct 07 13:21:53 crc kubenswrapper[4677]: I1007 13:21:53.359890 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65080209-d321-4dee-b020-d47f33f7c408-catalog-content\") pod \"65080209-d321-4dee-b020-d47f33f7c408\" (UID: \"65080209-d321-4dee-b020-d47f33f7c408\") " Oct 07 13:21:53 crc kubenswrapper[4677]: I1007 13:21:53.360024 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gth77\" (UniqueName: \"kubernetes.io/projected/65080209-d321-4dee-b020-d47f33f7c408-kube-api-access-gth77\") pod \"65080209-d321-4dee-b020-d47f33f7c408\" (UID: \"65080209-d321-4dee-b020-d47f33f7c408\") " Oct 07 13:21:53 crc kubenswrapper[4677]: I1007 13:21:53.360138 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65080209-d321-4dee-b020-d47f33f7c408-utilities\") pod \"65080209-d321-4dee-b020-d47f33f7c408\" (UID: \"65080209-d321-4dee-b020-d47f33f7c408\") " Oct 07 13:21:53 crc kubenswrapper[4677]: I1007 13:21:53.361274 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/65080209-d321-4dee-b020-d47f33f7c408-utilities" (OuterVolumeSpecName: "utilities") pod "65080209-d321-4dee-b020-d47f33f7c408" (UID: "65080209-d321-4dee-b020-d47f33f7c408"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:21:53 crc kubenswrapper[4677]: I1007 13:21:53.370698 4677 scope.go:117] "RemoveContainer" containerID="8e02c35f035965c57c7d5156ebfd2259102b1b799b3e1f57d3d9b180ab6543dd" Oct 07 13:21:53 crc kubenswrapper[4677]: I1007 13:21:53.370700 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65080209-d321-4dee-b020-d47f33f7c408-kube-api-access-gth77" (OuterVolumeSpecName: "kube-api-access-gth77") pod "65080209-d321-4dee-b020-d47f33f7c408" (UID: "65080209-d321-4dee-b020-d47f33f7c408"). InnerVolumeSpecName "kube-api-access-gth77". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:21:53 crc kubenswrapper[4677]: E1007 13:21:53.371856 4677 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8e02c35f035965c57c7d5156ebfd2259102b1b799b3e1f57d3d9b180ab6543dd\": container with ID starting with 8e02c35f035965c57c7d5156ebfd2259102b1b799b3e1f57d3d9b180ab6543dd not found: ID does not exist" containerID="8e02c35f035965c57c7d5156ebfd2259102b1b799b3e1f57d3d9b180ab6543dd" Oct 07 13:21:53 crc kubenswrapper[4677]: I1007 13:21:53.371917 4677 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e02c35f035965c57c7d5156ebfd2259102b1b799b3e1f57d3d9b180ab6543dd"} err="failed to get container status \"8e02c35f035965c57c7d5156ebfd2259102b1b799b3e1f57d3d9b180ab6543dd\": rpc error: code = NotFound desc = could not find container \"8e02c35f035965c57c7d5156ebfd2259102b1b799b3e1f57d3d9b180ab6543dd\": container with ID starting with 8e02c35f035965c57c7d5156ebfd2259102b1b799b3e1f57d3d9b180ab6543dd not found: ID does not exist" Oct 07 13:21:53 crc kubenswrapper[4677]: I1007 13:21:53.371953 4677 scope.go:117] "RemoveContainer" containerID="8b5b6c3d8d62ffa997ba4829c1b6b2d9692ac48fdf04050828e973cb724ee24f" Oct 07 13:21:53 crc kubenswrapper[4677]: E1007 13:21:53.372739 4677 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8b5b6c3d8d62ffa997ba4829c1b6b2d9692ac48fdf04050828e973cb724ee24f\": container with ID starting with 8b5b6c3d8d62ffa997ba4829c1b6b2d9692ac48fdf04050828e973cb724ee24f not found: ID does not exist" containerID="8b5b6c3d8d62ffa997ba4829c1b6b2d9692ac48fdf04050828e973cb724ee24f" Oct 07 13:21:53 crc kubenswrapper[4677]: I1007 13:21:53.372785 4677 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b5b6c3d8d62ffa997ba4829c1b6b2d9692ac48fdf04050828e973cb724ee24f"} err="failed to get container status \"8b5b6c3d8d62ffa997ba4829c1b6b2d9692ac48fdf04050828e973cb724ee24f\": rpc error: code = NotFound desc = could not find container \"8b5b6c3d8d62ffa997ba4829c1b6b2d9692ac48fdf04050828e973cb724ee24f\": container with ID starting with 8b5b6c3d8d62ffa997ba4829c1b6b2d9692ac48fdf04050828e973cb724ee24f not found: ID does not exist" Oct 07 13:21:53 crc kubenswrapper[4677]: I1007 13:21:53.372810 4677 scope.go:117] "RemoveContainer" containerID="d6e18ec07b05ff9e2ad763d8a9c7c24e6339c97aa4d2550dd756f2e046e92ccf" Oct 07 13:21:53 crc kubenswrapper[4677]: E1007 13:21:53.373802 4677 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d6e18ec07b05ff9e2ad763d8a9c7c24e6339c97aa4d2550dd756f2e046e92ccf\": container with ID starting with d6e18ec07b05ff9e2ad763d8a9c7c24e6339c97aa4d2550dd756f2e046e92ccf not found: ID does not exist" containerID="d6e18ec07b05ff9e2ad763d8a9c7c24e6339c97aa4d2550dd756f2e046e92ccf" Oct 07 13:21:53 crc kubenswrapper[4677]: I1007 13:21:53.373842 4677 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d6e18ec07b05ff9e2ad763d8a9c7c24e6339c97aa4d2550dd756f2e046e92ccf"} err="failed to get container status \"d6e18ec07b05ff9e2ad763d8a9c7c24e6339c97aa4d2550dd756f2e046e92ccf\": rpc error: code = NotFound desc = could not find container \"d6e18ec07b05ff9e2ad763d8a9c7c24e6339c97aa4d2550dd756f2e046e92ccf\": container with ID starting with 
d6e18ec07b05ff9e2ad763d8a9c7c24e6339c97aa4d2550dd756f2e046e92ccf not found: ID does not exist" Oct 07 13:21:53 crc kubenswrapper[4677]: I1007 13:21:53.382489 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/65080209-d321-4dee-b020-d47f33f7c408-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "65080209-d321-4dee-b020-d47f33f7c408" (UID: "65080209-d321-4dee-b020-d47f33f7c408"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:21:53 crc kubenswrapper[4677]: I1007 13:21:53.462175 4677 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65080209-d321-4dee-b020-d47f33f7c408-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 13:21:53 crc kubenswrapper[4677]: I1007 13:21:53.462209 4677 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65080209-d321-4dee-b020-d47f33f7c408-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 13:21:53 crc kubenswrapper[4677]: I1007 13:21:53.462222 4677 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gth77\" (UniqueName: \"kubernetes.io/projected/65080209-d321-4dee-b020-d47f33f7c408-kube-api-access-gth77\") on node \"crc\" DevicePath \"\"" Oct 07 13:21:53 crc kubenswrapper[4677]: I1007 13:21:53.653639 4677 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-9k2v9"] Oct 07 13:21:53 crc kubenswrapper[4677]: I1007 13:21:53.662731 4677 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-9k2v9"] Oct 07 13:21:55 crc kubenswrapper[4677]: I1007 13:21:55.315559 4677 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="65080209-d321-4dee-b020-d47f33f7c408" path="/var/lib/kubelet/pods/65080209-d321-4dee-b020-d47f33f7c408/volumes" Oct 07 13:21:57 crc kubenswrapper[4677]: I1007 13:21:57.906900 4677 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone-db-create-bt8wg"] Oct 07 13:21:57 crc kubenswrapper[4677]: E1007 13:21:57.907792 4677 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65080209-d321-4dee-b020-d47f33f7c408" containerName="extract-utilities" Oct 07 13:21:57 crc kubenswrapper[4677]: I1007 13:21:57.907823 4677 state_mem.go:107] "Deleted CPUSet assignment" podUID="65080209-d321-4dee-b020-d47f33f7c408" containerName="extract-utilities" Oct 07 13:21:57 crc kubenswrapper[4677]: E1007 13:21:57.907880 4677 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65080209-d321-4dee-b020-d47f33f7c408" containerName="extract-content" Oct 07 13:21:57 crc kubenswrapper[4677]: I1007 13:21:57.907898 4677 state_mem.go:107] "Deleted CPUSet assignment" podUID="65080209-d321-4dee-b020-d47f33f7c408" containerName="extract-content" Oct 07 13:21:57 crc kubenswrapper[4677]: E1007 13:21:57.907924 4677 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65080209-d321-4dee-b020-d47f33f7c408" containerName="registry-server" Oct 07 13:21:57 crc kubenswrapper[4677]: I1007 13:21:57.907983 4677 state_mem.go:107] "Deleted CPUSet assignment" podUID="65080209-d321-4dee-b020-d47f33f7c408" containerName="registry-server" Oct 07 13:21:57 crc kubenswrapper[4677]: I1007 13:21:57.908270 4677 memory_manager.go:354] "RemoveStaleState removing state" podUID="65080209-d321-4dee-b020-d47f33f7c408" containerName="registry-server" Oct 07 13:21:57 crc kubenswrapper[4677]: 
I1007 13:21:57.909221 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-db-create-bt8wg" Oct 07 13:21:57 crc kubenswrapper[4677]: I1007 13:21:57.918544 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-db-create-bt8wg"] Oct 07 13:21:58 crc kubenswrapper[4677]: I1007 13:21:58.026821 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fjxqb\" (UniqueName: \"kubernetes.io/projected/bf3722f2-ce90-446b-9310-211efd91af37-kube-api-access-fjxqb\") pod \"keystone-db-create-bt8wg\" (UID: \"bf3722f2-ce90-446b-9310-211efd91af37\") " pod="keystone-kuttl-tests/keystone-db-create-bt8wg" Oct 07 13:21:58 crc kubenswrapper[4677]: I1007 13:21:58.128748 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fjxqb\" (UniqueName: \"kubernetes.io/projected/bf3722f2-ce90-446b-9310-211efd91af37-kube-api-access-fjxqb\") pod \"keystone-db-create-bt8wg\" (UID: \"bf3722f2-ce90-446b-9310-211efd91af37\") " pod="keystone-kuttl-tests/keystone-db-create-bt8wg" Oct 07 13:21:58 crc kubenswrapper[4677]: I1007 13:21:58.159131 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fjxqb\" (UniqueName: \"kubernetes.io/projected/bf3722f2-ce90-446b-9310-211efd91af37-kube-api-access-fjxqb\") pod \"keystone-db-create-bt8wg\" (UID: \"bf3722f2-ce90-446b-9310-211efd91af37\") " pod="keystone-kuttl-tests/keystone-db-create-bt8wg" Oct 07 13:21:58 crc kubenswrapper[4677]: I1007 13:21:58.234665 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-db-create-bt8wg" Oct 07 13:21:58 crc kubenswrapper[4677]: I1007 13:21:58.659908 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-db-create-bt8wg"] Oct 07 13:21:59 crc kubenswrapper[4677]: I1007 13:21:59.358216 4677 generic.go:334] "Generic (PLEG): container finished" podID="bf3722f2-ce90-446b-9310-211efd91af37" containerID="40844dfa58017a0eeef941d1e032645d31a457ef288d1035463cd3709434c0c6" exitCode=0 Oct 07 13:21:59 crc kubenswrapper[4677]: I1007 13:21:59.358309 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-create-bt8wg" event={"ID":"bf3722f2-ce90-446b-9310-211efd91af37","Type":"ContainerDied","Data":"40844dfa58017a0eeef941d1e032645d31a457ef288d1035463cd3709434c0c6"} Oct 07 13:21:59 crc kubenswrapper[4677]: I1007 13:21:59.359690 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-create-bt8wg" event={"ID":"bf3722f2-ce90-446b-9310-211efd91af37","Type":"ContainerStarted","Data":"e9e1c516f3e061c9de56a906341a6a707cf4da946e2954fee0e6c84d6d8e6bdb"} Oct 07 13:21:59 crc kubenswrapper[4677]: I1007 13:21:59.530641 4677 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="keystone-kuttl-tests/rabbitmq-server-0" Oct 07 13:22:00 crc kubenswrapper[4677]: I1007 13:22:00.714131 4677 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-db-create-bt8wg" Oct 07 13:22:00 crc kubenswrapper[4677]: I1007 13:22:00.869950 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fjxqb\" (UniqueName: \"kubernetes.io/projected/bf3722f2-ce90-446b-9310-211efd91af37-kube-api-access-fjxqb\") pod \"bf3722f2-ce90-446b-9310-211efd91af37\" (UID: \"bf3722f2-ce90-446b-9310-211efd91af37\") " Oct 07 13:22:00 crc kubenswrapper[4677]: I1007 13:22:00.878128 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf3722f2-ce90-446b-9310-211efd91af37-kube-api-access-fjxqb" (OuterVolumeSpecName: "kube-api-access-fjxqb") pod "bf3722f2-ce90-446b-9310-211efd91af37" (UID: "bf3722f2-ce90-446b-9310-211efd91af37"). InnerVolumeSpecName "kube-api-access-fjxqb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:22:00 crc kubenswrapper[4677]: I1007 13:22:00.973679 4677 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fjxqb\" (UniqueName: \"kubernetes.io/projected/bf3722f2-ce90-446b-9310-211efd91af37-kube-api-access-fjxqb\") on node \"crc\" DevicePath \"\"" Oct 07 13:22:01 crc kubenswrapper[4677]: I1007 13:22:01.375494 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-create-bt8wg" event={"ID":"bf3722f2-ce90-446b-9310-211efd91af37","Type":"ContainerDied","Data":"e9e1c516f3e061c9de56a906341a6a707cf4da946e2954fee0e6c84d6d8e6bdb"} Oct 07 13:22:01 crc kubenswrapper[4677]: I1007 13:22:01.375540 4677 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e9e1c516f3e061c9de56a906341a6a707cf4da946e2954fee0e6c84d6d8e6bdb" Oct 07 13:22:01 crc kubenswrapper[4677]: I1007 13:22:01.375601 4677 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-db-create-bt8wg" Oct 07 13:22:07 crc kubenswrapper[4677]: I1007 13:22:07.807234 4677 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone-e90f-account-create-bh98g"] Oct 07 13:22:07 crc kubenswrapper[4677]: E1007 13:22:07.808211 4677 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf3722f2-ce90-446b-9310-211efd91af37" containerName="mariadb-database-create" Oct 07 13:22:07 crc kubenswrapper[4677]: I1007 13:22:07.808233 4677 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf3722f2-ce90-446b-9310-211efd91af37" containerName="mariadb-database-create" Oct 07 13:22:07 crc kubenswrapper[4677]: I1007 13:22:07.808470 4677 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf3722f2-ce90-446b-9310-211efd91af37" containerName="mariadb-database-create" Oct 07 13:22:07 crc kubenswrapper[4677]: I1007 13:22:07.809182 4677 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-e90f-account-create-bh98g" Oct 07 13:22:07 crc kubenswrapper[4677]: I1007 13:22:07.811561 4677 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-db-secret" Oct 07 13:22:07 crc kubenswrapper[4677]: I1007 13:22:07.827768 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-e90f-account-create-bh98g"] Oct 07 13:22:07 crc kubenswrapper[4677]: I1007 13:22:07.974291 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fpgmk\" (UniqueName: \"kubernetes.io/projected/f0671a0f-965b-495f-882a-8b892ca011c4-kube-api-access-fpgmk\") pod \"keystone-e90f-account-create-bh98g\" (UID: \"f0671a0f-965b-495f-882a-8b892ca011c4\") " pod="keystone-kuttl-tests/keystone-e90f-account-create-bh98g" Oct 07 13:22:08 crc kubenswrapper[4677]: I1007 13:22:08.076396 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fpgmk\" (UniqueName: \"kubernetes.io/projected/f0671a0f-965b-495f-882a-8b892ca011c4-kube-api-access-fpgmk\") pod \"keystone-e90f-account-create-bh98g\" (UID: \"f0671a0f-965b-495f-882a-8b892ca011c4\") " pod="keystone-kuttl-tests/keystone-e90f-account-create-bh98g" Oct 07 13:22:08 crc kubenswrapper[4677]: I1007 13:22:08.108601 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fpgmk\" (UniqueName: \"kubernetes.io/projected/f0671a0f-965b-495f-882a-8b892ca011c4-kube-api-access-fpgmk\") pod \"keystone-e90f-account-create-bh98g\" (UID: \"f0671a0f-965b-495f-882a-8b892ca011c4\") " pod="keystone-kuttl-tests/keystone-e90f-account-create-bh98g" Oct 07 13:22:08 crc kubenswrapper[4677]: I1007 13:22:08.133346 4677 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-e90f-account-create-bh98g" Oct 07 13:22:08 crc kubenswrapper[4677]: I1007 13:22:08.420522 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-e90f-account-create-bh98g"] Oct 07 13:22:08 crc kubenswrapper[4677]: W1007 13:22:08.431660 4677 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf0671a0f_965b_495f_882a_8b892ca011c4.slice/crio-85b777cb209e56af67b4d3dc13edc2bcc65f0cfdcbca3bb0888619a5c6cd0571 WatchSource:0}: Error finding container 85b777cb209e56af67b4d3dc13edc2bcc65f0cfdcbca3bb0888619a5c6cd0571: Status 404 returned error can't find the container with id 85b777cb209e56af67b4d3dc13edc2bcc65f0cfdcbca3bb0888619a5c6cd0571 Oct 07 13:22:08 crc kubenswrapper[4677]: I1007 13:22:08.440827 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-e90f-account-create-bh98g" event={"ID":"f0671a0f-965b-495f-882a-8b892ca011c4","Type":"ContainerStarted","Data":"85b777cb209e56af67b4d3dc13edc2bcc65f0cfdcbca3bb0888619a5c6cd0571"} Oct 07 13:22:09 crc kubenswrapper[4677]: I1007 13:22:09.456237 4677 generic.go:334] "Generic (PLEG): container finished" podID="f0671a0f-965b-495f-882a-8b892ca011c4" containerID="37d41ffe7a8114750e66197238c19584d3771fc02d844b75cdcb4df7324ac026" exitCode=0 Oct 07 13:22:09 crc kubenswrapper[4677]: I1007 13:22:09.456342 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-e90f-account-create-bh98g" event={"ID":"f0671a0f-965b-495f-882a-8b892ca011c4","Type":"ContainerDied","Data":"37d41ffe7a8114750e66197238c19584d3771fc02d844b75cdcb4df7324ac026"} Oct 07 13:22:10 crc kubenswrapper[4677]: I1007 13:22:10.768310 4677 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-e90f-account-create-bh98g" Oct 07 13:22:10 crc kubenswrapper[4677]: I1007 13:22:10.924261 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fpgmk\" (UniqueName: \"kubernetes.io/projected/f0671a0f-965b-495f-882a-8b892ca011c4-kube-api-access-fpgmk\") pod \"f0671a0f-965b-495f-882a-8b892ca011c4\" (UID: \"f0671a0f-965b-495f-882a-8b892ca011c4\") " Oct 07 13:22:10 crc kubenswrapper[4677]: I1007 13:22:10.932740 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f0671a0f-965b-495f-882a-8b892ca011c4-kube-api-access-fpgmk" (OuterVolumeSpecName: "kube-api-access-fpgmk") pod "f0671a0f-965b-495f-882a-8b892ca011c4" (UID: "f0671a0f-965b-495f-882a-8b892ca011c4"). InnerVolumeSpecName "kube-api-access-fpgmk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:22:11 crc kubenswrapper[4677]: I1007 13:22:11.026126 4677 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fpgmk\" (UniqueName: \"kubernetes.io/projected/f0671a0f-965b-495f-882a-8b892ca011c4-kube-api-access-fpgmk\") on node \"crc\" DevicePath \"\"" Oct 07 13:22:11 crc kubenswrapper[4677]: I1007 13:22:11.470341 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-e90f-account-create-bh98g" event={"ID":"f0671a0f-965b-495f-882a-8b892ca011c4","Type":"ContainerDied","Data":"85b777cb209e56af67b4d3dc13edc2bcc65f0cfdcbca3bb0888619a5c6cd0571"} Oct 07 13:22:11 crc kubenswrapper[4677]: I1007 13:22:11.470381 4677 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="85b777cb209e56af67b4d3dc13edc2bcc65f0cfdcbca3bb0888619a5c6cd0571" Oct 07 13:22:11 crc kubenswrapper[4677]: I1007 13:22:11.470389 4677 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-e90f-account-create-bh98g" Oct 07 13:22:13 crc kubenswrapper[4677]: I1007 13:22:13.367881 4677 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone-db-sync-n4qcj"] Oct 07 13:22:13 crc kubenswrapper[4677]: E1007 13:22:13.368810 4677 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0671a0f-965b-495f-882a-8b892ca011c4" containerName="mariadb-account-create" Oct 07 13:22:13 crc kubenswrapper[4677]: I1007 13:22:13.368842 4677 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0671a0f-965b-495f-882a-8b892ca011c4" containerName="mariadb-account-create" Oct 07 13:22:13 crc kubenswrapper[4677]: I1007 13:22:13.369107 4677 memory_manager.go:354] "RemoveStaleState removing state" podUID="f0671a0f-965b-495f-882a-8b892ca011c4" containerName="mariadb-account-create" Oct 07 13:22:13 crc kubenswrapper[4677]: I1007 13:22:13.370334 4677 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-db-sync-n4qcj" Oct 07 13:22:13 crc kubenswrapper[4677]: I1007 13:22:13.372604 4677 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-scripts" Oct 07 13:22:13 crc kubenswrapper[4677]: I1007 13:22:13.372965 4677 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-keystone-dockercfg-lrcpv" Oct 07 13:22:13 crc kubenswrapper[4677]: I1007 13:22:13.373305 4677 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-config-data" Oct 07 13:22:13 crc kubenswrapper[4677]: I1007 13:22:13.379667 4677 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone" Oct 07 13:22:13 crc kubenswrapper[4677]: I1007 13:22:13.384206 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-db-sync-n4qcj"] Oct 07 13:22:13 crc kubenswrapper[4677]: I1007 13:22:13.463572 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zfcll\" (UniqueName: \"kubernetes.io/projected/c1b1942d-b0d7-4d01-8166-7a78f0908987-kube-api-access-zfcll\") pod \"keystone-db-sync-n4qcj\" (UID: \"c1b1942d-b0d7-4d01-8166-7a78f0908987\") " pod="keystone-kuttl-tests/keystone-db-sync-n4qcj" Oct 07 13:22:13 crc kubenswrapper[4677]: I1007 13:22:13.463638 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c1b1942d-b0d7-4d01-8166-7a78f0908987-config-data\") pod \"keystone-db-sync-n4qcj\" (UID: \"c1b1942d-b0d7-4d01-8166-7a78f0908987\") " pod="keystone-kuttl-tests/keystone-db-sync-n4qcj" Oct 07 13:22:13 crc kubenswrapper[4677]: I1007 13:22:13.566553 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zfcll\" (UniqueName: \"kubernetes.io/projected/c1b1942d-b0d7-4d01-8166-7a78f0908987-kube-api-access-zfcll\") pod \"keystone-db-sync-n4qcj\" (UID: \"c1b1942d-b0d7-4d01-8166-7a78f0908987\") " pod="keystone-kuttl-tests/keystone-db-sync-n4qcj" Oct 07 13:22:13 crc kubenswrapper[4677]: I1007 13:22:13.566620 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c1b1942d-b0d7-4d01-8166-7a78f0908987-config-data\") pod \"keystone-db-sync-n4qcj\" (UID: \"c1b1942d-b0d7-4d01-8166-7a78f0908987\") " pod="keystone-kuttl-tests/keystone-db-sync-n4qcj" Oct 07 13:22:13 crc kubenswrapper[4677]: I1007 13:22:13.580551 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c1b1942d-b0d7-4d01-8166-7a78f0908987-config-data\") pod \"keystone-db-sync-n4qcj\" (UID: \"c1b1942d-b0d7-4d01-8166-7a78f0908987\") " pod="keystone-kuttl-tests/keystone-db-sync-n4qcj" Oct 07 13:22:13 crc kubenswrapper[4677]: I1007 13:22:13.586356 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zfcll\" (UniqueName: \"kubernetes.io/projected/c1b1942d-b0d7-4d01-8166-7a78f0908987-kube-api-access-zfcll\") pod \"keystone-db-sync-n4qcj\" (UID: \"c1b1942d-b0d7-4d01-8166-7a78f0908987\") " pod="keystone-kuttl-tests/keystone-db-sync-n4qcj" Oct 07 13:22:13 crc kubenswrapper[4677]: I1007 13:22:13.703868 4677 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-db-sync-n4qcj" Oct 07 13:22:13 crc kubenswrapper[4677]: I1007 13:22:13.958678 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-db-sync-n4qcj"] Oct 07 13:22:13 crc kubenswrapper[4677]: W1007 13:22:13.968485 4677 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc1b1942d_b0d7_4d01_8166_7a78f0908987.slice/crio-637d09dfffed096bc46487a623e9f8d519476a73b6434a59a4213fc5b55dac56 WatchSource:0}: Error finding container 637d09dfffed096bc46487a623e9f8d519476a73b6434a59a4213fc5b55dac56: Status 404 returned error can't find the container with id 637d09dfffed096bc46487a623e9f8d519476a73b6434a59a4213fc5b55dac56 Oct 07 13:22:14 crc kubenswrapper[4677]: I1007 13:22:14.494755 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-sync-n4qcj" event={"ID":"c1b1942d-b0d7-4d01-8166-7a78f0908987","Type":"ContainerStarted","Data":"637d09dfffed096bc46487a623e9f8d519476a73b6434a59a4213fc5b55dac56"} Oct 07 13:22:21 crc kubenswrapper[4677]: I1007 13:22:21.548596 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-sync-n4qcj" event={"ID":"c1b1942d-b0d7-4d01-8166-7a78f0908987","Type":"ContainerStarted","Data":"337a35d3da0185088efc41bb34b6262c81c262ab4264f9d1541567d20d14ab88"} Oct 07 13:22:21 crc kubenswrapper[4677]: I1007 13:22:21.579052 4677 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keystone-kuttl-tests/keystone-db-sync-n4qcj" podStartSLOduration=2.178055363 podStartE2EDuration="8.579022646s" podCreationTimestamp="2025-10-07 13:22:13 +0000 UTC" firstStartedPulling="2025-10-07 13:22:13.984454997 +0000 UTC m=+905.470164142" lastFinishedPulling="2025-10-07 13:22:20.38542231 +0000 UTC m=+911.871131425" observedRunningTime="2025-10-07 13:22:21.569677617 +0000 UTC m=+913.055386792" watchObservedRunningTime="2025-10-07 13:22:21.579022646 +0000 UTC m=+913.064731831" Oct 07 13:22:23 crc kubenswrapper[4677]: I1007 13:22:23.567048 4677 generic.go:334] "Generic (PLEG): container finished" podID="c1b1942d-b0d7-4d01-8166-7a78f0908987" containerID="337a35d3da0185088efc41bb34b6262c81c262ab4264f9d1541567d20d14ab88" exitCode=0 Oct 07 13:22:23 crc kubenswrapper[4677]: I1007 13:22:23.567256 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-sync-n4qcj" event={"ID":"c1b1942d-b0d7-4d01-8166-7a78f0908987","Type":"ContainerDied","Data":"337a35d3da0185088efc41bb34b6262c81c262ab4264f9d1541567d20d14ab88"} Oct 07 13:22:24 crc kubenswrapper[4677]: I1007 13:22:24.856664 4677 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-db-sync-n4qcj" Oct 07 13:22:24 crc kubenswrapper[4677]: I1007 13:22:24.942497 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c1b1942d-b0d7-4d01-8166-7a78f0908987-config-data\") pod \"c1b1942d-b0d7-4d01-8166-7a78f0908987\" (UID: \"c1b1942d-b0d7-4d01-8166-7a78f0908987\") " Oct 07 13:22:24 crc kubenswrapper[4677]: I1007 13:22:24.942583 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zfcll\" (UniqueName: \"kubernetes.io/projected/c1b1942d-b0d7-4d01-8166-7a78f0908987-kube-api-access-zfcll\") pod \"c1b1942d-b0d7-4d01-8166-7a78f0908987\" (UID: \"c1b1942d-b0d7-4d01-8166-7a78f0908987\") " Oct 07 13:22:24 crc kubenswrapper[4677]: I1007 13:22:24.948983 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c1b1942d-b0d7-4d01-8166-7a78f0908987-kube-api-access-zfcll" (OuterVolumeSpecName: "kube-api-access-zfcll") pod "c1b1942d-b0d7-4d01-8166-7a78f0908987" (UID: "c1b1942d-b0d7-4d01-8166-7a78f0908987"). InnerVolumeSpecName "kube-api-access-zfcll". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:22:24 crc kubenswrapper[4677]: I1007 13:22:24.979685 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c1b1942d-b0d7-4d01-8166-7a78f0908987-config-data" (OuterVolumeSpecName: "config-data") pod "c1b1942d-b0d7-4d01-8166-7a78f0908987" (UID: "c1b1942d-b0d7-4d01-8166-7a78f0908987"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:22:25 crc kubenswrapper[4677]: I1007 13:22:25.044001 4677 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c1b1942d-b0d7-4d01-8166-7a78f0908987-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 13:22:25 crc kubenswrapper[4677]: I1007 13:22:25.044040 4677 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zfcll\" (UniqueName: \"kubernetes.io/projected/c1b1942d-b0d7-4d01-8166-7a78f0908987-kube-api-access-zfcll\") on node \"crc\" DevicePath \"\"" Oct 07 13:22:25 crc kubenswrapper[4677]: I1007 13:22:25.583879 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-sync-n4qcj" event={"ID":"c1b1942d-b0d7-4d01-8166-7a78f0908987","Type":"ContainerDied","Data":"637d09dfffed096bc46487a623e9f8d519476a73b6434a59a4213fc5b55dac56"} Oct 07 13:22:25 crc kubenswrapper[4677]: I1007 13:22:25.584483 4677 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="637d09dfffed096bc46487a623e9f8d519476a73b6434a59a4213fc5b55dac56" Oct 07 13:22:25 crc kubenswrapper[4677]: I1007 13:22:25.583986 4677 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-db-sync-n4qcj" Oct 07 13:22:25 crc kubenswrapper[4677]: I1007 13:22:25.798493 4677 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone-bootstrap-cdp9l"] Oct 07 13:22:25 crc kubenswrapper[4677]: E1007 13:22:25.798881 4677 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1b1942d-b0d7-4d01-8166-7a78f0908987" containerName="keystone-db-sync" Oct 07 13:22:25 crc kubenswrapper[4677]: I1007 13:22:25.798907 4677 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1b1942d-b0d7-4d01-8166-7a78f0908987" containerName="keystone-db-sync" Oct 07 13:22:25 crc kubenswrapper[4677]: I1007 13:22:25.799130 4677 memory_manager.go:354] "RemoveStaleState removing state" podUID="c1b1942d-b0d7-4d01-8166-7a78f0908987" containerName="keystone-db-sync" Oct 07 13:22:25 crc kubenswrapper[4677]: I1007 13:22:25.799866 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-bootstrap-cdp9l" Oct 07 13:22:25 crc kubenswrapper[4677]: I1007 13:22:25.803502 4677 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-config-data" Oct 07 13:22:25 crc kubenswrapper[4677]: I1007 13:22:25.803674 4677 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-scripts" Oct 07 13:22:25 crc kubenswrapper[4677]: I1007 13:22:25.803795 4677 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-keystone-dockercfg-lrcpv" Oct 07 13:22:25 crc kubenswrapper[4677]: I1007 13:22:25.803911 4677 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone" Oct 07 13:22:25 crc kubenswrapper[4677]: I1007 13:22:25.809581 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-bootstrap-cdp9l"] Oct 07 13:22:25 crc kubenswrapper[4677]: I1007 13:22:25.958916 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/bb86344d-27e2-4b68-b0ca-06f6622b6666-credential-keys\") pod \"keystone-bootstrap-cdp9l\" (UID: \"bb86344d-27e2-4b68-b0ca-06f6622b6666\") " pod="keystone-kuttl-tests/keystone-bootstrap-cdp9l" Oct 07 13:22:25 crc kubenswrapper[4677]: I1007 13:22:25.958961 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bb86344d-27e2-4b68-b0ca-06f6622b6666-scripts\") pod \"keystone-bootstrap-cdp9l\" (UID: \"bb86344d-27e2-4b68-b0ca-06f6622b6666\") " pod="keystone-kuttl-tests/keystone-bootstrap-cdp9l" Oct 07 13:22:25 crc kubenswrapper[4677]: I1007 13:22:25.959004 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/bb86344d-27e2-4b68-b0ca-06f6622b6666-fernet-keys\") pod \"keystone-bootstrap-cdp9l\" (UID: \"bb86344d-27e2-4b68-b0ca-06f6622b6666\") " pod="keystone-kuttl-tests/keystone-bootstrap-cdp9l" Oct 07 13:22:25 crc kubenswrapper[4677]: I1007 13:22:25.959037 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-94l6n\" (UniqueName: \"kubernetes.io/projected/bb86344d-27e2-4b68-b0ca-06f6622b6666-kube-api-access-94l6n\") pod \"keystone-bootstrap-cdp9l\" (UID: \"bb86344d-27e2-4b68-b0ca-06f6622b6666\") " pod="keystone-kuttl-tests/keystone-bootstrap-cdp9l" Oct 07 
13:22:25 crc kubenswrapper[4677]: I1007 13:22:25.959072 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb86344d-27e2-4b68-b0ca-06f6622b6666-config-data\") pod \"keystone-bootstrap-cdp9l\" (UID: \"bb86344d-27e2-4b68-b0ca-06f6622b6666\") " pod="keystone-kuttl-tests/keystone-bootstrap-cdp9l" Oct 07 13:22:26 crc kubenswrapper[4677]: I1007 13:22:26.060413 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/bb86344d-27e2-4b68-b0ca-06f6622b6666-credential-keys\") pod \"keystone-bootstrap-cdp9l\" (UID: \"bb86344d-27e2-4b68-b0ca-06f6622b6666\") " pod="keystone-kuttl-tests/keystone-bootstrap-cdp9l" Oct 07 13:22:26 crc kubenswrapper[4677]: I1007 13:22:26.060470 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bb86344d-27e2-4b68-b0ca-06f6622b6666-scripts\") pod \"keystone-bootstrap-cdp9l\" (UID: \"bb86344d-27e2-4b68-b0ca-06f6622b6666\") " pod="keystone-kuttl-tests/keystone-bootstrap-cdp9l" Oct 07 13:22:26 crc kubenswrapper[4677]: I1007 13:22:26.060503 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/bb86344d-27e2-4b68-b0ca-06f6622b6666-fernet-keys\") pod \"keystone-bootstrap-cdp9l\" (UID: \"bb86344d-27e2-4b68-b0ca-06f6622b6666\") " pod="keystone-kuttl-tests/keystone-bootstrap-cdp9l" Oct 07 13:22:26 crc kubenswrapper[4677]: I1007 13:22:26.060523 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-94l6n\" (UniqueName: \"kubernetes.io/projected/bb86344d-27e2-4b68-b0ca-06f6622b6666-kube-api-access-94l6n\") pod \"keystone-bootstrap-cdp9l\" (UID: \"bb86344d-27e2-4b68-b0ca-06f6622b6666\") " pod="keystone-kuttl-tests/keystone-bootstrap-cdp9l" Oct 07 13:22:26 crc kubenswrapper[4677]: I1007 13:22:26.060550 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb86344d-27e2-4b68-b0ca-06f6622b6666-config-data\") pod \"keystone-bootstrap-cdp9l\" (UID: \"bb86344d-27e2-4b68-b0ca-06f6622b6666\") " pod="keystone-kuttl-tests/keystone-bootstrap-cdp9l" Oct 07 13:22:26 crc kubenswrapper[4677]: I1007 13:22:26.065520 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/bb86344d-27e2-4b68-b0ca-06f6622b6666-credential-keys\") pod \"keystone-bootstrap-cdp9l\" (UID: \"bb86344d-27e2-4b68-b0ca-06f6622b6666\") " pod="keystone-kuttl-tests/keystone-bootstrap-cdp9l" Oct 07 13:22:26 crc kubenswrapper[4677]: I1007 13:22:26.067755 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bb86344d-27e2-4b68-b0ca-06f6622b6666-scripts\") pod \"keystone-bootstrap-cdp9l\" (UID: \"bb86344d-27e2-4b68-b0ca-06f6622b6666\") " pod="keystone-kuttl-tests/keystone-bootstrap-cdp9l" Oct 07 13:22:26 crc kubenswrapper[4677]: I1007 13:22:26.068635 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/bb86344d-27e2-4b68-b0ca-06f6622b6666-fernet-keys\") pod \"keystone-bootstrap-cdp9l\" (UID: \"bb86344d-27e2-4b68-b0ca-06f6622b6666\") " pod="keystone-kuttl-tests/keystone-bootstrap-cdp9l" Oct 07 13:22:26 crc kubenswrapper[4677]: I1007 13:22:26.078543 4677 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb86344d-27e2-4b68-b0ca-06f6622b6666-config-data\") pod \"keystone-bootstrap-cdp9l\" (UID: \"bb86344d-27e2-4b68-b0ca-06f6622b6666\") " pod="keystone-kuttl-tests/keystone-bootstrap-cdp9l" Oct 07 13:22:26 crc kubenswrapper[4677]: I1007 13:22:26.083850 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-94l6n\" (UniqueName: \"kubernetes.io/projected/bb86344d-27e2-4b68-b0ca-06f6622b6666-kube-api-access-94l6n\") pod \"keystone-bootstrap-cdp9l\" (UID: \"bb86344d-27e2-4b68-b0ca-06f6622b6666\") " pod="keystone-kuttl-tests/keystone-bootstrap-cdp9l" Oct 07 13:22:26 crc kubenswrapper[4677]: I1007 13:22:26.116381 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-bootstrap-cdp9l" Oct 07 13:22:26 crc kubenswrapper[4677]: I1007 13:22:26.372466 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-bootstrap-cdp9l"] Oct 07 13:22:26 crc kubenswrapper[4677]: W1007 13:22:26.380737 4677 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbb86344d_27e2_4b68_b0ca_06f6622b6666.slice/crio-412884f6e6745caefc80a05df5df0e4bfeb94eec3701578631c3babfabdbe97f WatchSource:0}: Error finding container 412884f6e6745caefc80a05df5df0e4bfeb94eec3701578631c3babfabdbe97f: Status 404 returned error can't find the container with id 412884f6e6745caefc80a05df5df0e4bfeb94eec3701578631c3babfabdbe97f Oct 07 13:22:26 crc kubenswrapper[4677]: I1007 13:22:26.611672 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-bootstrap-cdp9l" event={"ID":"bb86344d-27e2-4b68-b0ca-06f6622b6666","Type":"ContainerStarted","Data":"21984cfe565a162691ddd942355e9162c94f0ef54b6859454c7da828d9d312ca"} Oct 07 13:22:26 crc kubenswrapper[4677]: I1007 13:22:26.611738 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-bootstrap-cdp9l" event={"ID":"bb86344d-27e2-4b68-b0ca-06f6622b6666","Type":"ContainerStarted","Data":"412884f6e6745caefc80a05df5df0e4bfeb94eec3701578631c3babfabdbe97f"} Oct 07 13:22:26 crc kubenswrapper[4677]: I1007 13:22:26.639865 4677 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keystone-kuttl-tests/keystone-bootstrap-cdp9l" podStartSLOduration=1.639842748 podStartE2EDuration="1.639842748s" podCreationTimestamp="2025-10-07 13:22:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:22:26.63678331 +0000 UTC m=+918.122492425" watchObservedRunningTime="2025-10-07 13:22:26.639842748 +0000 UTC m=+918.125551903" Oct 07 13:22:29 crc kubenswrapper[4677]: I1007 13:22:29.634712 4677 generic.go:334] "Generic (PLEG): container finished" podID="bb86344d-27e2-4b68-b0ca-06f6622b6666" containerID="21984cfe565a162691ddd942355e9162c94f0ef54b6859454c7da828d9d312ca" exitCode=0 Oct 07 13:22:29 crc kubenswrapper[4677]: I1007 13:22:29.634821 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-bootstrap-cdp9l" event={"ID":"bb86344d-27e2-4b68-b0ca-06f6622b6666","Type":"ContainerDied","Data":"21984cfe565a162691ddd942355e9162c94f0ef54b6859454c7da828d9d312ca"} Oct 07 13:22:30 crc kubenswrapper[4677]: I1007 13:22:30.993539 4677 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-bootstrap-cdp9l" Oct 07 13:22:31 crc kubenswrapper[4677]: I1007 13:22:31.130344 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-94l6n\" (UniqueName: \"kubernetes.io/projected/bb86344d-27e2-4b68-b0ca-06f6622b6666-kube-api-access-94l6n\") pod \"bb86344d-27e2-4b68-b0ca-06f6622b6666\" (UID: \"bb86344d-27e2-4b68-b0ca-06f6622b6666\") " Oct 07 13:22:31 crc kubenswrapper[4677]: I1007 13:22:31.130444 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bb86344d-27e2-4b68-b0ca-06f6622b6666-scripts\") pod \"bb86344d-27e2-4b68-b0ca-06f6622b6666\" (UID: \"bb86344d-27e2-4b68-b0ca-06f6622b6666\") " Oct 07 13:22:31 crc kubenswrapper[4677]: I1007 13:22:31.130527 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/bb86344d-27e2-4b68-b0ca-06f6622b6666-fernet-keys\") pod \"bb86344d-27e2-4b68-b0ca-06f6622b6666\" (UID: \"bb86344d-27e2-4b68-b0ca-06f6622b6666\") " Oct 07 13:22:31 crc kubenswrapper[4677]: I1007 13:22:31.130584 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/bb86344d-27e2-4b68-b0ca-06f6622b6666-credential-keys\") pod \"bb86344d-27e2-4b68-b0ca-06f6622b6666\" (UID: \"bb86344d-27e2-4b68-b0ca-06f6622b6666\") " Oct 07 13:22:31 crc kubenswrapper[4677]: I1007 13:22:31.131515 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb86344d-27e2-4b68-b0ca-06f6622b6666-config-data\") pod \"bb86344d-27e2-4b68-b0ca-06f6622b6666\" (UID: \"bb86344d-27e2-4b68-b0ca-06f6622b6666\") " Oct 07 13:22:31 crc kubenswrapper[4677]: I1007 13:22:31.138720 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb86344d-27e2-4b68-b0ca-06f6622b6666-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "bb86344d-27e2-4b68-b0ca-06f6622b6666" (UID: "bb86344d-27e2-4b68-b0ca-06f6622b6666"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:22:31 crc kubenswrapper[4677]: I1007 13:22:31.139797 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb86344d-27e2-4b68-b0ca-06f6622b6666-kube-api-access-94l6n" (OuterVolumeSpecName: "kube-api-access-94l6n") pod "bb86344d-27e2-4b68-b0ca-06f6622b6666" (UID: "bb86344d-27e2-4b68-b0ca-06f6622b6666"). InnerVolumeSpecName "kube-api-access-94l6n". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:22:31 crc kubenswrapper[4677]: I1007 13:22:31.148325 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb86344d-27e2-4b68-b0ca-06f6622b6666-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "bb86344d-27e2-4b68-b0ca-06f6622b6666" (UID: "bb86344d-27e2-4b68-b0ca-06f6622b6666"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:22:31 crc kubenswrapper[4677]: I1007 13:22:31.149018 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb86344d-27e2-4b68-b0ca-06f6622b6666-scripts" (OuterVolumeSpecName: "scripts") pod "bb86344d-27e2-4b68-b0ca-06f6622b6666" (UID: "bb86344d-27e2-4b68-b0ca-06f6622b6666"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:22:31 crc kubenswrapper[4677]: I1007 13:22:31.156635 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb86344d-27e2-4b68-b0ca-06f6622b6666-config-data" (OuterVolumeSpecName: "config-data") pod "bb86344d-27e2-4b68-b0ca-06f6622b6666" (UID: "bb86344d-27e2-4b68-b0ca-06f6622b6666"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:22:31 crc kubenswrapper[4677]: I1007 13:22:31.232982 4677 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bb86344d-27e2-4b68-b0ca-06f6622b6666-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 13:22:31 crc kubenswrapper[4677]: I1007 13:22:31.233039 4677 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/bb86344d-27e2-4b68-b0ca-06f6622b6666-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 07 13:22:31 crc kubenswrapper[4677]: I1007 13:22:31.233071 4677 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/bb86344d-27e2-4b68-b0ca-06f6622b6666-credential-keys\") on node \"crc\" DevicePath \"\"" Oct 07 13:22:31 crc kubenswrapper[4677]: I1007 13:22:31.233092 4677 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb86344d-27e2-4b68-b0ca-06f6622b6666-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 13:22:31 crc kubenswrapper[4677]: I1007 13:22:31.233110 4677 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-94l6n\" (UniqueName: \"kubernetes.io/projected/bb86344d-27e2-4b68-b0ca-06f6622b6666-kube-api-access-94l6n\") on node \"crc\" DevicePath \"\"" Oct 07 13:22:31 crc kubenswrapper[4677]: I1007 13:22:31.653613 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-bootstrap-cdp9l" event={"ID":"bb86344d-27e2-4b68-b0ca-06f6622b6666","Type":"ContainerDied","Data":"412884f6e6745caefc80a05df5df0e4bfeb94eec3701578631c3babfabdbe97f"} Oct 07 13:22:31 crc kubenswrapper[4677]: I1007 13:22:31.654055 4677 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="412884f6e6745caefc80a05df5df0e4bfeb94eec3701578631c3babfabdbe97f" Oct 07 13:22:31 crc kubenswrapper[4677]: I1007 13:22:31.653726 4677 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-bootstrap-cdp9l" Oct 07 13:22:31 crc kubenswrapper[4677]: I1007 13:22:31.849122 4677 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone-67d5b9b9b4-m4tl8"] Oct 07 13:22:31 crc kubenswrapper[4677]: E1007 13:22:31.849405 4677 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb86344d-27e2-4b68-b0ca-06f6622b6666" containerName="keystone-bootstrap" Oct 07 13:22:31 crc kubenswrapper[4677]: I1007 13:22:31.849420 4677 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb86344d-27e2-4b68-b0ca-06f6622b6666" containerName="keystone-bootstrap" Oct 07 13:22:31 crc kubenswrapper[4677]: I1007 13:22:31.849595 4677 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb86344d-27e2-4b68-b0ca-06f6622b6666" containerName="keystone-bootstrap" Oct 07 13:22:31 crc kubenswrapper[4677]: I1007 13:22:31.850099 4677 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-67d5b9b9b4-m4tl8" Oct 07 13:22:31 crc kubenswrapper[4677]: I1007 13:22:31.853099 4677 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone" Oct 07 13:22:31 crc kubenswrapper[4677]: I1007 13:22:31.853594 4677 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-scripts" Oct 07 13:22:31 crc kubenswrapper[4677]: I1007 13:22:31.854148 4677 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-config-data" Oct 07 13:22:31 crc kubenswrapper[4677]: I1007 13:22:31.854225 4677 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-keystone-dockercfg-lrcpv" Oct 07 13:22:31 crc kubenswrapper[4677]: I1007 13:22:31.881733 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-67d5b9b9b4-m4tl8"] Oct 07 13:22:31 crc kubenswrapper[4677]: I1007 13:22:31.943094 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wrbb7\" (UniqueName: \"kubernetes.io/projected/a1e620e5-bd71-4d49-a2d6-a9106f477de2-kube-api-access-wrbb7\") pod \"keystone-67d5b9b9b4-m4tl8\" (UID: \"a1e620e5-bd71-4d49-a2d6-a9106f477de2\") " pod="keystone-kuttl-tests/keystone-67d5b9b9b4-m4tl8" Oct 07 13:22:31 crc kubenswrapper[4677]: I1007 13:22:31.943203 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a1e620e5-bd71-4d49-a2d6-a9106f477de2-scripts\") pod \"keystone-67d5b9b9b4-m4tl8\" (UID: \"a1e620e5-bd71-4d49-a2d6-a9106f477de2\") " pod="keystone-kuttl-tests/keystone-67d5b9b9b4-m4tl8" Oct 07 13:22:31 crc kubenswrapper[4677]: I1007 13:22:31.943292 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a1e620e5-bd71-4d49-a2d6-a9106f477de2-credential-keys\") pod \"keystone-67d5b9b9b4-m4tl8\" (UID: \"a1e620e5-bd71-4d49-a2d6-a9106f477de2\") " pod="keystone-kuttl-tests/keystone-67d5b9b9b4-m4tl8" Oct 07 13:22:31 crc kubenswrapper[4677]: I1007 13:22:31.943407 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a1e620e5-bd71-4d49-a2d6-a9106f477de2-fernet-keys\") pod \"keystone-67d5b9b9b4-m4tl8\" (UID: \"a1e620e5-bd71-4d49-a2d6-a9106f477de2\") " pod="keystone-kuttl-tests/keystone-67d5b9b9b4-m4tl8" Oct 07 13:22:31 crc kubenswrapper[4677]: I1007 13:22:31.943549 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a1e620e5-bd71-4d49-a2d6-a9106f477de2-config-data\") pod \"keystone-67d5b9b9b4-m4tl8\" (UID: \"a1e620e5-bd71-4d49-a2d6-a9106f477de2\") " pod="keystone-kuttl-tests/keystone-67d5b9b9b4-m4tl8" Oct 07 13:22:32 crc kubenswrapper[4677]: I1007 13:22:32.045172 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a1e620e5-bd71-4d49-a2d6-a9106f477de2-fernet-keys\") pod \"keystone-67d5b9b9b4-m4tl8\" (UID: \"a1e620e5-bd71-4d49-a2d6-a9106f477de2\") " pod="keystone-kuttl-tests/keystone-67d5b9b9b4-m4tl8" Oct 07 13:22:32 crc kubenswrapper[4677]: I1007 13:22:32.045254 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/a1e620e5-bd71-4d49-a2d6-a9106f477de2-config-data\") pod \"keystone-67d5b9b9b4-m4tl8\" (UID: \"a1e620e5-bd71-4d49-a2d6-a9106f477de2\") " pod="keystone-kuttl-tests/keystone-67d5b9b9b4-m4tl8" Oct 07 13:22:32 crc kubenswrapper[4677]: I1007 13:22:32.045367 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wrbb7\" (UniqueName: \"kubernetes.io/projected/a1e620e5-bd71-4d49-a2d6-a9106f477de2-kube-api-access-wrbb7\") pod \"keystone-67d5b9b9b4-m4tl8\" (UID: \"a1e620e5-bd71-4d49-a2d6-a9106f477de2\") " pod="keystone-kuttl-tests/keystone-67d5b9b9b4-m4tl8" Oct 07 13:22:32 crc kubenswrapper[4677]: I1007 13:22:32.045473 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a1e620e5-bd71-4d49-a2d6-a9106f477de2-scripts\") pod \"keystone-67d5b9b9b4-m4tl8\" (UID: \"a1e620e5-bd71-4d49-a2d6-a9106f477de2\") " pod="keystone-kuttl-tests/keystone-67d5b9b9b4-m4tl8" Oct 07 13:22:32 crc kubenswrapper[4677]: I1007 13:22:32.045509 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a1e620e5-bd71-4d49-a2d6-a9106f477de2-credential-keys\") pod \"keystone-67d5b9b9b4-m4tl8\" (UID: \"a1e620e5-bd71-4d49-a2d6-a9106f477de2\") " pod="keystone-kuttl-tests/keystone-67d5b9b9b4-m4tl8" Oct 07 13:22:32 crc kubenswrapper[4677]: I1007 13:22:32.050419 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a1e620e5-bd71-4d49-a2d6-a9106f477de2-scripts\") pod \"keystone-67d5b9b9b4-m4tl8\" (UID: \"a1e620e5-bd71-4d49-a2d6-a9106f477de2\") " pod="keystone-kuttl-tests/keystone-67d5b9b9b4-m4tl8" Oct 07 13:22:32 crc kubenswrapper[4677]: I1007 13:22:32.050532 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a1e620e5-bd71-4d49-a2d6-a9106f477de2-config-data\") pod \"keystone-67d5b9b9b4-m4tl8\" (UID: \"a1e620e5-bd71-4d49-a2d6-a9106f477de2\") " pod="keystone-kuttl-tests/keystone-67d5b9b9b4-m4tl8" Oct 07 13:22:32 crc kubenswrapper[4677]: I1007 13:22:32.051400 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a1e620e5-bd71-4d49-a2d6-a9106f477de2-fernet-keys\") pod \"keystone-67d5b9b9b4-m4tl8\" (UID: \"a1e620e5-bd71-4d49-a2d6-a9106f477de2\") " pod="keystone-kuttl-tests/keystone-67d5b9b9b4-m4tl8" Oct 07 13:22:32 crc kubenswrapper[4677]: I1007 13:22:32.051710 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a1e620e5-bd71-4d49-a2d6-a9106f477de2-credential-keys\") pod \"keystone-67d5b9b9b4-m4tl8\" (UID: \"a1e620e5-bd71-4d49-a2d6-a9106f477de2\") " pod="keystone-kuttl-tests/keystone-67d5b9b9b4-m4tl8" Oct 07 13:22:32 crc kubenswrapper[4677]: I1007 13:22:32.063654 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wrbb7\" (UniqueName: \"kubernetes.io/projected/a1e620e5-bd71-4d49-a2d6-a9106f477de2-kube-api-access-wrbb7\") pod \"keystone-67d5b9b9b4-m4tl8\" (UID: \"a1e620e5-bd71-4d49-a2d6-a9106f477de2\") " pod="keystone-kuttl-tests/keystone-67d5b9b9b4-m4tl8" Oct 07 13:22:32 crc kubenswrapper[4677]: I1007 13:22:32.174060 4677 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-67d5b9b9b4-m4tl8" Oct 07 13:22:32 crc kubenswrapper[4677]: W1007 13:22:32.610149 4677 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda1e620e5_bd71_4d49_a2d6_a9106f477de2.slice/crio-baccb333f269fee501d5249db8b014fdce1ac92bf75aabc9375e1dcf48fb7c96 WatchSource:0}: Error finding container baccb333f269fee501d5249db8b014fdce1ac92bf75aabc9375e1dcf48fb7c96: Status 404 returned error can't find the container with id baccb333f269fee501d5249db8b014fdce1ac92bf75aabc9375e1dcf48fb7c96 Oct 07 13:22:32 crc kubenswrapper[4677]: I1007 13:22:32.610199 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-67d5b9b9b4-m4tl8"] Oct 07 13:22:32 crc kubenswrapper[4677]: I1007 13:22:32.664468 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-67d5b9b9b4-m4tl8" event={"ID":"a1e620e5-bd71-4d49-a2d6-a9106f477de2","Type":"ContainerStarted","Data":"baccb333f269fee501d5249db8b014fdce1ac92bf75aabc9375e1dcf48fb7c96"} Oct 07 13:22:33 crc kubenswrapper[4677]: I1007 13:22:33.673701 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-67d5b9b9b4-m4tl8" event={"ID":"a1e620e5-bd71-4d49-a2d6-a9106f477de2","Type":"ContainerStarted","Data":"9aed236d1700553789b0fd0e96372799e20e302103f224fb4a599715590c88d0"} Oct 07 13:22:33 crc kubenswrapper[4677]: I1007 13:22:33.674458 4677 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="keystone-kuttl-tests/keystone-67d5b9b9b4-m4tl8" Oct 07 13:22:33 crc kubenswrapper[4677]: I1007 13:22:33.704609 4677 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keystone-kuttl-tests/keystone-67d5b9b9b4-m4tl8" podStartSLOduration=2.704577379 podStartE2EDuration="2.704577379s" podCreationTimestamp="2025-10-07 13:22:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:22:33.69869218 +0000 UTC m=+925.184401295" watchObservedRunningTime="2025-10-07 13:22:33.704577379 +0000 UTC m=+925.190286534" Oct 07 13:23:03 crc kubenswrapper[4677]: I1007 13:23:03.549344 4677 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="keystone-kuttl-tests/keystone-67d5b9b9b4-m4tl8" Oct 07 13:23:04 crc kubenswrapper[4677]: E1007 13:23:04.668200 4677 log.go:32] "Failed when writing line to log file" err="http2: stream closed" path="/var/log/pods/keystone-kuttl-tests_keystone-67d5b9b9b4-m4tl8_a1e620e5-bd71-4d49-a2d6-a9106f477de2/keystone-api/0.log" line={} Oct 07 13:23:05 crc kubenswrapper[4677]: I1007 13:23:05.013785 4677 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone-6fdd8d84d5-dmf5d"] Oct 07 13:23:05 crc kubenswrapper[4677]: I1007 13:23:05.014632 4677 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-6fdd8d84d5-dmf5d" Oct 07 13:23:05 crc kubenswrapper[4677]: I1007 13:23:05.021099 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-6fdd8d84d5-dmf5d"] Oct 07 13:23:05 crc kubenswrapper[4677]: E1007 13:23:05.127411 4677 log.go:32] "Failed when writing line to log file" err="http2: stream closed" path="/var/log/pods/keystone-kuttl-tests_keystone-67d5b9b9b4-m4tl8_a1e620e5-bd71-4d49-a2d6-a9106f477de2/keystone-api/0.log" line={} Oct 07 13:23:05 crc kubenswrapper[4677]: I1007 13:23:05.138468 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b12f031a-8152-4087-abf7-0eb4063f6406-config-data\") pod \"keystone-6fdd8d84d5-dmf5d\" (UID: \"b12f031a-8152-4087-abf7-0eb4063f6406\") " pod="keystone-kuttl-tests/keystone-6fdd8d84d5-dmf5d" Oct 07 13:23:05 crc kubenswrapper[4677]: I1007 13:23:05.138569 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b12f031a-8152-4087-abf7-0eb4063f6406-fernet-keys\") pod \"keystone-6fdd8d84d5-dmf5d\" (UID: \"b12f031a-8152-4087-abf7-0eb4063f6406\") " pod="keystone-kuttl-tests/keystone-6fdd8d84d5-dmf5d" Oct 07 13:23:05 crc kubenswrapper[4677]: I1007 13:23:05.138624 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b12f031a-8152-4087-abf7-0eb4063f6406-credential-keys\") pod \"keystone-6fdd8d84d5-dmf5d\" (UID: \"b12f031a-8152-4087-abf7-0eb4063f6406\") " pod="keystone-kuttl-tests/keystone-6fdd8d84d5-dmf5d" Oct 07 13:23:05 crc kubenswrapper[4677]: I1007 13:23:05.138705 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wnt42\" (UniqueName: \"kubernetes.io/projected/b12f031a-8152-4087-abf7-0eb4063f6406-kube-api-access-wnt42\") pod \"keystone-6fdd8d84d5-dmf5d\" (UID: \"b12f031a-8152-4087-abf7-0eb4063f6406\") " pod="keystone-kuttl-tests/keystone-6fdd8d84d5-dmf5d" Oct 07 13:23:05 crc kubenswrapper[4677]: I1007 13:23:05.138772 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b12f031a-8152-4087-abf7-0eb4063f6406-scripts\") pod \"keystone-6fdd8d84d5-dmf5d\" (UID: \"b12f031a-8152-4087-abf7-0eb4063f6406\") " pod="keystone-kuttl-tests/keystone-6fdd8d84d5-dmf5d" Oct 07 13:23:05 crc kubenswrapper[4677]: I1007 13:23:05.240086 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wnt42\" (UniqueName: \"kubernetes.io/projected/b12f031a-8152-4087-abf7-0eb4063f6406-kube-api-access-wnt42\") pod \"keystone-6fdd8d84d5-dmf5d\" (UID: \"b12f031a-8152-4087-abf7-0eb4063f6406\") " pod="keystone-kuttl-tests/keystone-6fdd8d84d5-dmf5d" Oct 07 13:23:05 crc kubenswrapper[4677]: I1007 13:23:05.240161 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b12f031a-8152-4087-abf7-0eb4063f6406-scripts\") pod \"keystone-6fdd8d84d5-dmf5d\" (UID: \"b12f031a-8152-4087-abf7-0eb4063f6406\") " pod="keystone-kuttl-tests/keystone-6fdd8d84d5-dmf5d" Oct 07 13:23:05 crc kubenswrapper[4677]: I1007 13:23:05.240206 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/b12f031a-8152-4087-abf7-0eb4063f6406-config-data\") pod \"keystone-6fdd8d84d5-dmf5d\" (UID: \"b12f031a-8152-4087-abf7-0eb4063f6406\") " pod="keystone-kuttl-tests/keystone-6fdd8d84d5-dmf5d" Oct 07 13:23:05 crc kubenswrapper[4677]: I1007 13:23:05.240251 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b12f031a-8152-4087-abf7-0eb4063f6406-fernet-keys\") pod \"keystone-6fdd8d84d5-dmf5d\" (UID: \"b12f031a-8152-4087-abf7-0eb4063f6406\") " pod="keystone-kuttl-tests/keystone-6fdd8d84d5-dmf5d" Oct 07 13:23:05 crc kubenswrapper[4677]: I1007 13:23:05.240282 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b12f031a-8152-4087-abf7-0eb4063f6406-credential-keys\") pod \"keystone-6fdd8d84d5-dmf5d\" (UID: \"b12f031a-8152-4087-abf7-0eb4063f6406\") " pod="keystone-kuttl-tests/keystone-6fdd8d84d5-dmf5d" Oct 07 13:23:05 crc kubenswrapper[4677]: I1007 13:23:05.247753 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b12f031a-8152-4087-abf7-0eb4063f6406-scripts\") pod \"keystone-6fdd8d84d5-dmf5d\" (UID: \"b12f031a-8152-4087-abf7-0eb4063f6406\") " pod="keystone-kuttl-tests/keystone-6fdd8d84d5-dmf5d" Oct 07 13:23:05 crc kubenswrapper[4677]: I1007 13:23:05.248461 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b12f031a-8152-4087-abf7-0eb4063f6406-credential-keys\") pod \"keystone-6fdd8d84d5-dmf5d\" (UID: \"b12f031a-8152-4087-abf7-0eb4063f6406\") " pod="keystone-kuttl-tests/keystone-6fdd8d84d5-dmf5d" Oct 07 13:23:05 crc kubenswrapper[4677]: I1007 13:23:05.249312 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b12f031a-8152-4087-abf7-0eb4063f6406-config-data\") pod \"keystone-6fdd8d84d5-dmf5d\" (UID: \"b12f031a-8152-4087-abf7-0eb4063f6406\") " pod="keystone-kuttl-tests/keystone-6fdd8d84d5-dmf5d" Oct 07 13:23:05 crc kubenswrapper[4677]: I1007 13:23:05.251522 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b12f031a-8152-4087-abf7-0eb4063f6406-fernet-keys\") pod \"keystone-6fdd8d84d5-dmf5d\" (UID: \"b12f031a-8152-4087-abf7-0eb4063f6406\") " pod="keystone-kuttl-tests/keystone-6fdd8d84d5-dmf5d" Oct 07 13:23:05 crc kubenswrapper[4677]: I1007 13:23:05.259018 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wnt42\" (UniqueName: \"kubernetes.io/projected/b12f031a-8152-4087-abf7-0eb4063f6406-kube-api-access-wnt42\") pod \"keystone-6fdd8d84d5-dmf5d\" (UID: \"b12f031a-8152-4087-abf7-0eb4063f6406\") " pod="keystone-kuttl-tests/keystone-6fdd8d84d5-dmf5d" Oct 07 13:23:05 crc kubenswrapper[4677]: I1007 13:23:05.347993 4677 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-6fdd8d84d5-dmf5d" Oct 07 13:23:05 crc kubenswrapper[4677]: I1007 13:23:05.807316 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-6fdd8d84d5-dmf5d"] Oct 07 13:23:05 crc kubenswrapper[4677]: W1007 13:23:05.814423 4677 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb12f031a_8152_4087_abf7_0eb4063f6406.slice/crio-6efa117f04bb801ed3adc4585c76d2cbf146b70047ac047b2c6b3ad03365991f WatchSource:0}: Error finding container 6efa117f04bb801ed3adc4585c76d2cbf146b70047ac047b2c6b3ad03365991f: Status 404 returned error can't find the container with id 6efa117f04bb801ed3adc4585c76d2cbf146b70047ac047b2c6b3ad03365991f Oct 07 13:23:05 crc kubenswrapper[4677]: I1007 13:23:05.927540 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-6fdd8d84d5-dmf5d" event={"ID":"b12f031a-8152-4087-abf7-0eb4063f6406","Type":"ContainerStarted","Data":"6efa117f04bb801ed3adc4585c76d2cbf146b70047ac047b2c6b3ad03365991f"} Oct 07 13:23:06 crc kubenswrapper[4677]: I1007 13:23:06.490834 4677 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-bootstrap-cdp9l"] Oct 07 13:23:06 crc kubenswrapper[4677]: I1007 13:23:06.499558 4677 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone-bootstrap-cdp9l"] Oct 07 13:23:06 crc kubenswrapper[4677]: I1007 13:23:06.503406 4677 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-db-sync-n4qcj"] Oct 07 13:23:06 crc kubenswrapper[4677]: I1007 13:23:06.506767 4677 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone-db-sync-n4qcj"] Oct 07 13:23:06 crc kubenswrapper[4677]: I1007 13:23:06.516827 4677 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-67d5b9b9b4-m4tl8"] Oct 07 13:23:06 crc kubenswrapper[4677]: I1007 13:23:06.517089 4677 kuberuntime_container.go:808] "Killing container with a grace period" pod="keystone-kuttl-tests/keystone-67d5b9b9b4-m4tl8" podUID="a1e620e5-bd71-4d49-a2d6-a9106f477de2" containerName="keystone-api" containerID="cri-o://9aed236d1700553789b0fd0e96372799e20e302103f224fb4a599715590c88d0" gracePeriod=30 Oct 07 13:23:06 crc kubenswrapper[4677]: I1007 13:23:06.523806 4677 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-6fdd8d84d5-dmf5d"] Oct 07 13:23:06 crc kubenswrapper[4677]: I1007 13:23:06.550891 4677 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystonee90f-account-delete-v8jr8"] Oct 07 13:23:06 crc kubenswrapper[4677]: I1007 13:23:06.551841 4677 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystonee90f-account-delete-v8jr8" Oct 07 13:23:06 crc kubenswrapper[4677]: I1007 13:23:06.557490 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystonee90f-account-delete-v8jr8"] Oct 07 13:23:06 crc kubenswrapper[4677]: I1007 13:23:06.593788 4677 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-db-create-bt8wg"] Oct 07 13:23:06 crc kubenswrapper[4677]: I1007 13:23:06.603961 4677 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone-db-create-bt8wg"] Oct 07 13:23:06 crc kubenswrapper[4677]: I1007 13:23:06.610925 4677 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystonee90f-account-delete-v8jr8"] Oct 07 13:23:06 crc kubenswrapper[4677]: E1007 13:23:06.611418 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-x9wxv], unattached volumes=[], failed to process volumes=[]: context canceled" pod="keystone-kuttl-tests/keystonee90f-account-delete-v8jr8" podUID="554d18aa-1d5b-4791-9ab2-45906f4ad356" Oct 07 13:23:06 crc kubenswrapper[4677]: I1007 13:23:06.617355 4677 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-e90f-account-create-bh98g"] Oct 07 13:23:06 crc kubenswrapper[4677]: I1007 13:23:06.624089 4677 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone-e90f-account-create-bh98g"] Oct 07 13:23:06 crc kubenswrapper[4677]: I1007 13:23:06.660872 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x9wxv\" (UniqueName: \"kubernetes.io/projected/554d18aa-1d5b-4791-9ab2-45906f4ad356-kube-api-access-x9wxv\") pod \"keystonee90f-account-delete-v8jr8\" (UID: \"554d18aa-1d5b-4791-9ab2-45906f4ad356\") " pod="keystone-kuttl-tests/keystonee90f-account-delete-v8jr8" Oct 07 13:23:06 crc kubenswrapper[4677]: I1007 13:23:06.762840 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x9wxv\" (UniqueName: \"kubernetes.io/projected/554d18aa-1d5b-4791-9ab2-45906f4ad356-kube-api-access-x9wxv\") pod \"keystonee90f-account-delete-v8jr8\" (UID: \"554d18aa-1d5b-4791-9ab2-45906f4ad356\") " pod="keystone-kuttl-tests/keystonee90f-account-delete-v8jr8" Oct 07 13:23:06 crc kubenswrapper[4677]: I1007 13:23:06.792402 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x9wxv\" (UniqueName: \"kubernetes.io/projected/554d18aa-1d5b-4791-9ab2-45906f4ad356-kube-api-access-x9wxv\") pod \"keystonee90f-account-delete-v8jr8\" (UID: \"554d18aa-1d5b-4791-9ab2-45906f4ad356\") " pod="keystone-kuttl-tests/keystonee90f-account-delete-v8jr8" Oct 07 13:23:06 crc kubenswrapper[4677]: I1007 13:23:06.937889 4677 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystonee90f-account-delete-v8jr8" Oct 07 13:23:06 crc kubenswrapper[4677]: I1007 13:23:06.937910 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-6fdd8d84d5-dmf5d" event={"ID":"b12f031a-8152-4087-abf7-0eb4063f6406","Type":"ContainerStarted","Data":"e5c236630f39cac66bb8492ec1b8e6351e6d14c66bfc50f9c39f899139410708"} Oct 07 13:23:06 crc kubenswrapper[4677]: I1007 13:23:06.938119 4677 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="keystone-kuttl-tests/keystone-6fdd8d84d5-dmf5d" Oct 07 13:23:06 crc kubenswrapper[4677]: I1007 13:23:06.938303 4677 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." pod="keystone-kuttl-tests/keystone-6fdd8d84d5-dmf5d" secret="" err="secret \"keystone-keystone-dockercfg-lrcpv\" not found" Oct 07 13:23:06 crc kubenswrapper[4677]: I1007 13:23:06.950991 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystonee90f-account-delete-v8jr8" Oct 07 13:23:07 crc kubenswrapper[4677]: I1007 13:23:07.067585 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x9wxv\" (UniqueName: \"kubernetes.io/projected/554d18aa-1d5b-4791-9ab2-45906f4ad356-kube-api-access-x9wxv\") pod \"554d18aa-1d5b-4791-9ab2-45906f4ad356\" (UID: \"554d18aa-1d5b-4791-9ab2-45906f4ad356\") " Oct 07 13:23:07 crc kubenswrapper[4677]: E1007 13:23:07.068181 4677 secret.go:188] Couldn't get secret keystone-kuttl-tests/keystone-config-data: secret "keystone-config-data" not found Oct 07 13:23:07 crc kubenswrapper[4677]: E1007 13:23:07.068236 4677 secret.go:188] Couldn't get secret keystone-kuttl-tests/keystone: secret "keystone" not found Oct 07 13:23:07 crc kubenswrapper[4677]: E1007 13:23:07.068266 4677 secret.go:188] Couldn't get secret keystone-kuttl-tests/keystone: secret "keystone" not found Oct 07 13:23:07 crc kubenswrapper[4677]: E1007 13:23:07.068306 4677 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b12f031a-8152-4087-abf7-0eb4063f6406-config-data podName:b12f031a-8152-4087-abf7-0eb4063f6406 nodeName:}" failed. No retries permitted until 2025-10-07 13:23:07.568274667 +0000 UTC m=+959.053983822 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/b12f031a-8152-4087-abf7-0eb4063f6406-config-data") pod "keystone-6fdd8d84d5-dmf5d" (UID: "b12f031a-8152-4087-abf7-0eb4063f6406") : secret "keystone-config-data" not found Oct 07 13:23:07 crc kubenswrapper[4677]: E1007 13:23:07.068338 4677 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b12f031a-8152-4087-abf7-0eb4063f6406-credential-keys podName:b12f031a-8152-4087-abf7-0eb4063f6406 nodeName:}" failed. No retries permitted until 2025-10-07 13:23:07.568324499 +0000 UTC m=+959.054033654 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "credential-keys" (UniqueName: "kubernetes.io/secret/b12f031a-8152-4087-abf7-0eb4063f6406-credential-keys") pod "keystone-6fdd8d84d5-dmf5d" (UID: "b12f031a-8152-4087-abf7-0eb4063f6406") : secret "keystone" not found Oct 07 13:23:07 crc kubenswrapper[4677]: E1007 13:23:07.068386 4677 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b12f031a-8152-4087-abf7-0eb4063f6406-fernet-keys podName:b12f031a-8152-4087-abf7-0eb4063f6406 nodeName:}" failed. 
No retries permitted until 2025-10-07 13:23:07.56835615 +0000 UTC m=+959.054065315 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "fernet-keys" (UniqueName: "kubernetes.io/secret/b12f031a-8152-4087-abf7-0eb4063f6406-fernet-keys") pod "keystone-6fdd8d84d5-dmf5d" (UID: "b12f031a-8152-4087-abf7-0eb4063f6406") : secret "keystone" not found Oct 07 13:23:07 crc kubenswrapper[4677]: E1007 13:23:07.068404 4677 secret.go:188] Couldn't get secret keystone-kuttl-tests/keystone-scripts: secret "keystone-scripts" not found Oct 07 13:23:07 crc kubenswrapper[4677]: E1007 13:23:07.068558 4677 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b12f031a-8152-4087-abf7-0eb4063f6406-scripts podName:b12f031a-8152-4087-abf7-0eb4063f6406 nodeName:}" failed. No retries permitted until 2025-10-07 13:23:07.568524895 +0000 UTC m=+959.054234050 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "scripts" (UniqueName: "kubernetes.io/secret/b12f031a-8152-4087-abf7-0eb4063f6406-scripts") pod "keystone-6fdd8d84d5-dmf5d" (UID: "b12f031a-8152-4087-abf7-0eb4063f6406") : secret "keystone-scripts" not found Oct 07 13:23:07 crc kubenswrapper[4677]: I1007 13:23:07.073857 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/554d18aa-1d5b-4791-9ab2-45906f4ad356-kube-api-access-x9wxv" (OuterVolumeSpecName: "kube-api-access-x9wxv") pod "554d18aa-1d5b-4791-9ab2-45906f4ad356" (UID: "554d18aa-1d5b-4791-9ab2-45906f4ad356"). InnerVolumeSpecName "kube-api-access-x9wxv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:23:07 crc kubenswrapper[4677]: I1007 13:23:07.169581 4677 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x9wxv\" (UniqueName: \"kubernetes.io/projected/554d18aa-1d5b-4791-9ab2-45906f4ad356-kube-api-access-x9wxv\") on node \"crc\" DevicePath \"\"" Oct 07 13:23:07 crc kubenswrapper[4677]: I1007 13:23:07.313490 4677 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bb86344d-27e2-4b68-b0ca-06f6622b6666" path="/var/lib/kubelet/pods/bb86344d-27e2-4b68-b0ca-06f6622b6666/volumes" Oct 07 13:23:07 crc kubenswrapper[4677]: I1007 13:23:07.314162 4677 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf3722f2-ce90-446b-9310-211efd91af37" path="/var/lib/kubelet/pods/bf3722f2-ce90-446b-9310-211efd91af37/volumes" Oct 07 13:23:07 crc kubenswrapper[4677]: I1007 13:23:07.314895 4677 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c1b1942d-b0d7-4d01-8166-7a78f0908987" path="/var/lib/kubelet/pods/c1b1942d-b0d7-4d01-8166-7a78f0908987/volumes" Oct 07 13:23:07 crc kubenswrapper[4677]: I1007 13:23:07.315602 4677 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f0671a0f-965b-495f-882a-8b892ca011c4" path="/var/lib/kubelet/pods/f0671a0f-965b-495f-882a-8b892ca011c4/volumes" Oct 07 13:23:07 crc kubenswrapper[4677]: E1007 13:23:07.576831 4677 secret.go:188] Couldn't get secret keystone-kuttl-tests/keystone: secret "keystone" not found Oct 07 13:23:07 crc kubenswrapper[4677]: E1007 13:23:07.576987 4677 secret.go:188] Couldn't get secret keystone-kuttl-tests/keystone-scripts: secret "keystone-scripts" not found Oct 07 13:23:07 crc kubenswrapper[4677]: E1007 13:23:07.576987 4677 secret.go:188] Couldn't get secret keystone-kuttl-tests/keystone: secret "keystone" not found Oct 07 13:23:07 crc kubenswrapper[4677]: E1007 13:23:07.577034 4677 secret.go:188] Couldn't get secret 
keystone-kuttl-tests/keystone-config-data: secret "keystone-config-data" not found Oct 07 13:23:07 crc kubenswrapper[4677]: E1007 13:23:07.577160 4677 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b12f031a-8152-4087-abf7-0eb4063f6406-fernet-keys podName:b12f031a-8152-4087-abf7-0eb4063f6406 nodeName:}" failed. No retries permitted until 2025-10-07 13:23:08.577143522 +0000 UTC m=+960.062852637 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "fernet-keys" (UniqueName: "kubernetes.io/secret/b12f031a-8152-4087-abf7-0eb4063f6406-fernet-keys") pod "keystone-6fdd8d84d5-dmf5d" (UID: "b12f031a-8152-4087-abf7-0eb4063f6406") : secret "keystone" not found Oct 07 13:23:07 crc kubenswrapper[4677]: E1007 13:23:07.577286 4677 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b12f031a-8152-4087-abf7-0eb4063f6406-scripts podName:b12f031a-8152-4087-abf7-0eb4063f6406 nodeName:}" failed. No retries permitted until 2025-10-07 13:23:08.577265596 +0000 UTC m=+960.062974731 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "scripts" (UniqueName: "kubernetes.io/secret/b12f031a-8152-4087-abf7-0eb4063f6406-scripts") pod "keystone-6fdd8d84d5-dmf5d" (UID: "b12f031a-8152-4087-abf7-0eb4063f6406") : secret "keystone-scripts" not found Oct 07 13:23:07 crc kubenswrapper[4677]: E1007 13:23:07.577309 4677 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b12f031a-8152-4087-abf7-0eb4063f6406-credential-keys podName:b12f031a-8152-4087-abf7-0eb4063f6406 nodeName:}" failed. No retries permitted until 2025-10-07 13:23:08.577295857 +0000 UTC m=+960.063004992 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "credential-keys" (UniqueName: "kubernetes.io/secret/b12f031a-8152-4087-abf7-0eb4063f6406-credential-keys") pod "keystone-6fdd8d84d5-dmf5d" (UID: "b12f031a-8152-4087-abf7-0eb4063f6406") : secret "keystone" not found Oct 07 13:23:07 crc kubenswrapper[4677]: E1007 13:23:07.577346 4677 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b12f031a-8152-4087-abf7-0eb4063f6406-config-data podName:b12f031a-8152-4087-abf7-0eb4063f6406 nodeName:}" failed. No retries permitted until 2025-10-07 13:23:08.577333808 +0000 UTC m=+960.063042943 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/b12f031a-8152-4087-abf7-0eb4063f6406-config-data") pod "keystone-6fdd8d84d5-dmf5d" (UID: "b12f031a-8152-4087-abf7-0eb4063f6406") : secret "keystone-config-data" not found Oct 07 13:23:07 crc kubenswrapper[4677]: I1007 13:23:07.946647 4677 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystonee90f-account-delete-v8jr8" Oct 07 13:23:07 crc kubenswrapper[4677]: I1007 13:23:07.947387 4677 kuberuntime_container.go:808] "Killing container with a grace period" pod="keystone-kuttl-tests/keystone-6fdd8d84d5-dmf5d" podUID="b12f031a-8152-4087-abf7-0eb4063f6406" containerName="keystone-api" containerID="cri-o://e5c236630f39cac66bb8492ec1b8e6351e6d14c66bfc50f9c39f899139410708" gracePeriod=30 Oct 07 13:23:07 crc kubenswrapper[4677]: I1007 13:23:07.984368 4677 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keystone-kuttl-tests/keystone-6fdd8d84d5-dmf5d" podStartSLOduration=3.984341826 podStartE2EDuration="3.984341826s" podCreationTimestamp="2025-10-07 13:23:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:23:06.962203581 +0000 UTC m=+958.447912736" watchObservedRunningTime="2025-10-07 13:23:07.984341826 +0000 UTC m=+959.470050951" Oct 07 13:23:07 crc kubenswrapper[4677]: I1007 13:23:07.998500 4677 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystonee90f-account-delete-v8jr8"] Oct 07 13:23:08 crc kubenswrapper[4677]: I1007 13:23:08.006261 4677 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystonee90f-account-delete-v8jr8"] Oct 07 13:23:08 crc kubenswrapper[4677]: I1007 13:23:08.516034 4677 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-6fdd8d84d5-dmf5d" Oct 07 13:23:08 crc kubenswrapper[4677]: E1007 13:23:08.592490 4677 secret.go:188] Couldn't get secret keystone-kuttl-tests/keystone-scripts: secret "keystone-scripts" not found Oct 07 13:23:08 crc kubenswrapper[4677]: E1007 13:23:08.592522 4677 secret.go:188] Couldn't get secret keystone-kuttl-tests/keystone: secret "keystone" not found Oct 07 13:23:08 crc kubenswrapper[4677]: E1007 13:23:08.592594 4677 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b12f031a-8152-4087-abf7-0eb4063f6406-scripts podName:b12f031a-8152-4087-abf7-0eb4063f6406 nodeName:}" failed. No retries permitted until 2025-10-07 13:23:10.592571044 +0000 UTC m=+962.078280179 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "scripts" (UniqueName: "kubernetes.io/secret/b12f031a-8152-4087-abf7-0eb4063f6406-scripts") pod "keystone-6fdd8d84d5-dmf5d" (UID: "b12f031a-8152-4087-abf7-0eb4063f6406") : secret "keystone-scripts" not found Oct 07 13:23:08 crc kubenswrapper[4677]: E1007 13:23:08.592622 4677 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b12f031a-8152-4087-abf7-0eb4063f6406-credential-keys podName:b12f031a-8152-4087-abf7-0eb4063f6406 nodeName:}" failed. No retries permitted until 2025-10-07 13:23:10.592607865 +0000 UTC m=+962.078316990 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "credential-keys" (UniqueName: "kubernetes.io/secret/b12f031a-8152-4087-abf7-0eb4063f6406-credential-keys") pod "keystone-6fdd8d84d5-dmf5d" (UID: "b12f031a-8152-4087-abf7-0eb4063f6406") : secret "keystone" not found Oct 07 13:23:08 crc kubenswrapper[4677]: E1007 13:23:08.592626 4677 secret.go:188] Couldn't get secret keystone-kuttl-tests/keystone-config-data: secret "keystone-config-data" not found Oct 07 13:23:08 crc kubenswrapper[4677]: E1007 13:23:08.592649 4677 secret.go:188] Couldn't get secret keystone-kuttl-tests/keystone: secret "keystone" not found Oct 07 13:23:08 crc kubenswrapper[4677]: E1007 13:23:08.592776 4677 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b12f031a-8152-4087-abf7-0eb4063f6406-config-data podName:b12f031a-8152-4087-abf7-0eb4063f6406 nodeName:}" failed. No retries permitted until 2025-10-07 13:23:10.592739119 +0000 UTC m=+962.078448284 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/b12f031a-8152-4087-abf7-0eb4063f6406-config-data") pod "keystone-6fdd8d84d5-dmf5d" (UID: "b12f031a-8152-4087-abf7-0eb4063f6406") : secret "keystone-config-data" not found Oct 07 13:23:08 crc kubenswrapper[4677]: E1007 13:23:08.592811 4677 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b12f031a-8152-4087-abf7-0eb4063f6406-fernet-keys podName:b12f031a-8152-4087-abf7-0eb4063f6406 nodeName:}" failed. No retries permitted until 2025-10-07 13:23:10.59279726 +0000 UTC m=+962.078506415 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "fernet-keys" (UniqueName: "kubernetes.io/secret/b12f031a-8152-4087-abf7-0eb4063f6406-fernet-keys") pod "keystone-6fdd8d84d5-dmf5d" (UID: "b12f031a-8152-4087-abf7-0eb4063f6406") : secret "keystone" not found Oct 07 13:23:08 crc kubenswrapper[4677]: I1007 13:23:08.693157 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wnt42\" (UniqueName: \"kubernetes.io/projected/b12f031a-8152-4087-abf7-0eb4063f6406-kube-api-access-wnt42\") pod \"b12f031a-8152-4087-abf7-0eb4063f6406\" (UID: \"b12f031a-8152-4087-abf7-0eb4063f6406\") " Oct 07 13:23:08 crc kubenswrapper[4677]: I1007 13:23:08.693242 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b12f031a-8152-4087-abf7-0eb4063f6406-config-data\") pod \"b12f031a-8152-4087-abf7-0eb4063f6406\" (UID: \"b12f031a-8152-4087-abf7-0eb4063f6406\") " Oct 07 13:23:08 crc kubenswrapper[4677]: I1007 13:23:08.693303 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b12f031a-8152-4087-abf7-0eb4063f6406-fernet-keys\") pod \"b12f031a-8152-4087-abf7-0eb4063f6406\" (UID: \"b12f031a-8152-4087-abf7-0eb4063f6406\") " Oct 07 13:23:08 crc kubenswrapper[4677]: I1007 13:23:08.693330 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b12f031a-8152-4087-abf7-0eb4063f6406-credential-keys\") pod \"b12f031a-8152-4087-abf7-0eb4063f6406\" (UID: \"b12f031a-8152-4087-abf7-0eb4063f6406\") " Oct 07 13:23:08 crc kubenswrapper[4677]: I1007 13:23:08.693377 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b12f031a-8152-4087-abf7-0eb4063f6406-scripts\") pod 
\"b12f031a-8152-4087-abf7-0eb4063f6406\" (UID: \"b12f031a-8152-4087-abf7-0eb4063f6406\") " Oct 07 13:23:08 crc kubenswrapper[4677]: I1007 13:23:08.698283 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b12f031a-8152-4087-abf7-0eb4063f6406-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "b12f031a-8152-4087-abf7-0eb4063f6406" (UID: "b12f031a-8152-4087-abf7-0eb4063f6406"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:23:08 crc kubenswrapper[4677]: I1007 13:23:08.699606 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b12f031a-8152-4087-abf7-0eb4063f6406-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "b12f031a-8152-4087-abf7-0eb4063f6406" (UID: "b12f031a-8152-4087-abf7-0eb4063f6406"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:23:08 crc kubenswrapper[4677]: I1007 13:23:08.699822 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b12f031a-8152-4087-abf7-0eb4063f6406-kube-api-access-wnt42" (OuterVolumeSpecName: "kube-api-access-wnt42") pod "b12f031a-8152-4087-abf7-0eb4063f6406" (UID: "b12f031a-8152-4087-abf7-0eb4063f6406"). InnerVolumeSpecName "kube-api-access-wnt42". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:23:08 crc kubenswrapper[4677]: I1007 13:23:08.700660 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b12f031a-8152-4087-abf7-0eb4063f6406-scripts" (OuterVolumeSpecName: "scripts") pod "b12f031a-8152-4087-abf7-0eb4063f6406" (UID: "b12f031a-8152-4087-abf7-0eb4063f6406"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:23:08 crc kubenswrapper[4677]: I1007 13:23:08.715870 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b12f031a-8152-4087-abf7-0eb4063f6406-config-data" (OuterVolumeSpecName: "config-data") pod "b12f031a-8152-4087-abf7-0eb4063f6406" (UID: "b12f031a-8152-4087-abf7-0eb4063f6406"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:23:08 crc kubenswrapper[4677]: I1007 13:23:08.795651 4677 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wnt42\" (UniqueName: \"kubernetes.io/projected/b12f031a-8152-4087-abf7-0eb4063f6406-kube-api-access-wnt42\") on node \"crc\" DevicePath \"\"" Oct 07 13:23:08 crc kubenswrapper[4677]: I1007 13:23:08.795690 4677 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b12f031a-8152-4087-abf7-0eb4063f6406-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 13:23:08 crc kubenswrapper[4677]: I1007 13:23:08.795703 4677 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b12f031a-8152-4087-abf7-0eb4063f6406-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 07 13:23:08 crc kubenswrapper[4677]: I1007 13:23:08.795715 4677 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b12f031a-8152-4087-abf7-0eb4063f6406-credential-keys\") on node \"crc\" DevicePath \"\"" Oct 07 13:23:08 crc kubenswrapper[4677]: I1007 13:23:08.795726 4677 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b12f031a-8152-4087-abf7-0eb4063f6406-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 13:23:08 crc kubenswrapper[4677]: I1007 13:23:08.958235 4677 generic.go:334] "Generic (PLEG): container finished" podID="b12f031a-8152-4087-abf7-0eb4063f6406" containerID="e5c236630f39cac66bb8492ec1b8e6351e6d14c66bfc50f9c39f899139410708" exitCode=0 Oct 07 13:23:08 crc kubenswrapper[4677]: I1007 13:23:08.958305 4677 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-6fdd8d84d5-dmf5d" Oct 07 13:23:08 crc kubenswrapper[4677]: I1007 13:23:08.958307 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-6fdd8d84d5-dmf5d" event={"ID":"b12f031a-8152-4087-abf7-0eb4063f6406","Type":"ContainerDied","Data":"e5c236630f39cac66bb8492ec1b8e6351e6d14c66bfc50f9c39f899139410708"} Oct 07 13:23:08 crc kubenswrapper[4677]: I1007 13:23:08.958379 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-6fdd8d84d5-dmf5d" event={"ID":"b12f031a-8152-4087-abf7-0eb4063f6406","Type":"ContainerDied","Data":"6efa117f04bb801ed3adc4585c76d2cbf146b70047ac047b2c6b3ad03365991f"} Oct 07 13:23:08 crc kubenswrapper[4677]: I1007 13:23:08.958414 4677 scope.go:117] "RemoveContainer" containerID="e5c236630f39cac66bb8492ec1b8e6351e6d14c66bfc50f9c39f899139410708" Oct 07 13:23:08 crc kubenswrapper[4677]: I1007 13:23:08.999152 4677 scope.go:117] "RemoveContainer" containerID="e5c236630f39cac66bb8492ec1b8e6351e6d14c66bfc50f9c39f899139410708" Oct 07 13:23:09 crc kubenswrapper[4677]: E1007 13:23:09.000081 4677 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e5c236630f39cac66bb8492ec1b8e6351e6d14c66bfc50f9c39f899139410708\": container with ID starting with e5c236630f39cac66bb8492ec1b8e6351e6d14c66bfc50f9c39f899139410708 not found: ID does not exist" containerID="e5c236630f39cac66bb8492ec1b8e6351e6d14c66bfc50f9c39f899139410708" Oct 07 13:23:09 crc kubenswrapper[4677]: I1007 13:23:09.000127 4677 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e5c236630f39cac66bb8492ec1b8e6351e6d14c66bfc50f9c39f899139410708"} err="failed to get container status 
\"e5c236630f39cac66bb8492ec1b8e6351e6d14c66bfc50f9c39f899139410708\": rpc error: code = NotFound desc = could not find container \"e5c236630f39cac66bb8492ec1b8e6351e6d14c66bfc50f9c39f899139410708\": container with ID starting with e5c236630f39cac66bb8492ec1b8e6351e6d14c66bfc50f9c39f899139410708 not found: ID does not exist" Oct 07 13:23:09 crc kubenswrapper[4677]: I1007 13:23:09.010122 4677 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-6fdd8d84d5-dmf5d"] Oct 07 13:23:09 crc kubenswrapper[4677]: I1007 13:23:09.016972 4677 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone-6fdd8d84d5-dmf5d"] Oct 07 13:23:09 crc kubenswrapper[4677]: I1007 13:23:09.330745 4677 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="554d18aa-1d5b-4791-9ab2-45906f4ad356" path="/var/lib/kubelet/pods/554d18aa-1d5b-4791-9ab2-45906f4ad356/volumes" Oct 07 13:23:09 crc kubenswrapper[4677]: I1007 13:23:09.333279 4677 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b12f031a-8152-4087-abf7-0eb4063f6406" path="/var/lib/kubelet/pods/b12f031a-8152-4087-abf7-0eb4063f6406/volumes" Oct 07 13:23:09 crc kubenswrapper[4677]: I1007 13:23:09.954203 4677 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-67d5b9b9b4-m4tl8" Oct 07 13:23:09 crc kubenswrapper[4677]: I1007 13:23:09.977890 4677 generic.go:334] "Generic (PLEG): container finished" podID="a1e620e5-bd71-4d49-a2d6-a9106f477de2" containerID="9aed236d1700553789b0fd0e96372799e20e302103f224fb4a599715590c88d0" exitCode=0 Oct 07 13:23:09 crc kubenswrapper[4677]: I1007 13:23:09.978014 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-67d5b9b9b4-m4tl8" event={"ID":"a1e620e5-bd71-4d49-a2d6-a9106f477de2","Type":"ContainerDied","Data":"9aed236d1700553789b0fd0e96372799e20e302103f224fb4a599715590c88d0"} Oct 07 13:23:09 crc kubenswrapper[4677]: I1007 13:23:09.978091 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-67d5b9b9b4-m4tl8" event={"ID":"a1e620e5-bd71-4d49-a2d6-a9106f477de2","Type":"ContainerDied","Data":"baccb333f269fee501d5249db8b014fdce1ac92bf75aabc9375e1dcf48fb7c96"} Oct 07 13:23:09 crc kubenswrapper[4677]: I1007 13:23:09.978117 4677 scope.go:117] "RemoveContainer" containerID="9aed236d1700553789b0fd0e96372799e20e302103f224fb4a599715590c88d0" Oct 07 13:23:09 crc kubenswrapper[4677]: I1007 13:23:09.978160 4677 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-67d5b9b9b4-m4tl8" Oct 07 13:23:09 crc kubenswrapper[4677]: I1007 13:23:09.996415 4677 scope.go:117] "RemoveContainer" containerID="9aed236d1700553789b0fd0e96372799e20e302103f224fb4a599715590c88d0" Oct 07 13:23:09 crc kubenswrapper[4677]: E1007 13:23:09.997052 4677 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9aed236d1700553789b0fd0e96372799e20e302103f224fb4a599715590c88d0\": container with ID starting with 9aed236d1700553789b0fd0e96372799e20e302103f224fb4a599715590c88d0 not found: ID does not exist" containerID="9aed236d1700553789b0fd0e96372799e20e302103f224fb4a599715590c88d0" Oct 07 13:23:09 crc kubenswrapper[4677]: I1007 13:23:09.997090 4677 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9aed236d1700553789b0fd0e96372799e20e302103f224fb4a599715590c88d0"} err="failed to get container status \"9aed236d1700553789b0fd0e96372799e20e302103f224fb4a599715590c88d0\": rpc error: code = NotFound desc = could not find container \"9aed236d1700553789b0fd0e96372799e20e302103f224fb4a599715590c88d0\": container with ID starting with 9aed236d1700553789b0fd0e96372799e20e302103f224fb4a599715590c88d0 not found: ID does not exist" Oct 07 13:23:10 crc kubenswrapper[4677]: I1007 13:23:10.120504 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a1e620e5-bd71-4d49-a2d6-a9106f477de2-credential-keys\") pod \"a1e620e5-bd71-4d49-a2d6-a9106f477de2\" (UID: \"a1e620e5-bd71-4d49-a2d6-a9106f477de2\") " Oct 07 13:23:10 crc kubenswrapper[4677]: I1007 13:23:10.120607 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a1e620e5-bd71-4d49-a2d6-a9106f477de2-config-data\") pod \"a1e620e5-bd71-4d49-a2d6-a9106f477de2\" (UID: \"a1e620e5-bd71-4d49-a2d6-a9106f477de2\") " Oct 07 13:23:10 crc kubenswrapper[4677]: I1007 13:23:10.120710 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wrbb7\" (UniqueName: \"kubernetes.io/projected/a1e620e5-bd71-4d49-a2d6-a9106f477de2-kube-api-access-wrbb7\") pod \"a1e620e5-bd71-4d49-a2d6-a9106f477de2\" (UID: \"a1e620e5-bd71-4d49-a2d6-a9106f477de2\") " Oct 07 13:23:10 crc kubenswrapper[4677]: I1007 13:23:10.120755 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a1e620e5-bd71-4d49-a2d6-a9106f477de2-scripts\") pod \"a1e620e5-bd71-4d49-a2d6-a9106f477de2\" (UID: \"a1e620e5-bd71-4d49-a2d6-a9106f477de2\") " Oct 07 13:23:10 crc kubenswrapper[4677]: I1007 13:23:10.120862 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a1e620e5-bd71-4d49-a2d6-a9106f477de2-fernet-keys\") pod \"a1e620e5-bd71-4d49-a2d6-a9106f477de2\" (UID: \"a1e620e5-bd71-4d49-a2d6-a9106f477de2\") " Oct 07 13:23:10 crc kubenswrapper[4677]: I1007 13:23:10.126590 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1e620e5-bd71-4d49-a2d6-a9106f477de2-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "a1e620e5-bd71-4d49-a2d6-a9106f477de2" (UID: "a1e620e5-bd71-4d49-a2d6-a9106f477de2"). InnerVolumeSpecName "credential-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:23:10 crc kubenswrapper[4677]: I1007 13:23:10.126982 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a1e620e5-bd71-4d49-a2d6-a9106f477de2-kube-api-access-wrbb7" (OuterVolumeSpecName: "kube-api-access-wrbb7") pod "a1e620e5-bd71-4d49-a2d6-a9106f477de2" (UID: "a1e620e5-bd71-4d49-a2d6-a9106f477de2"). InnerVolumeSpecName "kube-api-access-wrbb7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:23:10 crc kubenswrapper[4677]: I1007 13:23:10.127414 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1e620e5-bd71-4d49-a2d6-a9106f477de2-scripts" (OuterVolumeSpecName: "scripts") pod "a1e620e5-bd71-4d49-a2d6-a9106f477de2" (UID: "a1e620e5-bd71-4d49-a2d6-a9106f477de2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:23:10 crc kubenswrapper[4677]: I1007 13:23:10.127978 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1e620e5-bd71-4d49-a2d6-a9106f477de2-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "a1e620e5-bd71-4d49-a2d6-a9106f477de2" (UID: "a1e620e5-bd71-4d49-a2d6-a9106f477de2"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:23:10 crc kubenswrapper[4677]: I1007 13:23:10.154015 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1e620e5-bd71-4d49-a2d6-a9106f477de2-config-data" (OuterVolumeSpecName: "config-data") pod "a1e620e5-bd71-4d49-a2d6-a9106f477de2" (UID: "a1e620e5-bd71-4d49-a2d6-a9106f477de2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:23:10 crc kubenswrapper[4677]: I1007 13:23:10.222835 4677 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a1e620e5-bd71-4d49-a2d6-a9106f477de2-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 13:23:10 crc kubenswrapper[4677]: I1007 13:23:10.222879 4677 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wrbb7\" (UniqueName: \"kubernetes.io/projected/a1e620e5-bd71-4d49-a2d6-a9106f477de2-kube-api-access-wrbb7\") on node \"crc\" DevicePath \"\"" Oct 07 13:23:10 crc kubenswrapper[4677]: I1007 13:23:10.222889 4677 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a1e620e5-bd71-4d49-a2d6-a9106f477de2-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 13:23:10 crc kubenswrapper[4677]: I1007 13:23:10.222897 4677 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a1e620e5-bd71-4d49-a2d6-a9106f477de2-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 07 13:23:10 crc kubenswrapper[4677]: I1007 13:23:10.222905 4677 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a1e620e5-bd71-4d49-a2d6-a9106f477de2-credential-keys\") on node \"crc\" DevicePath \"\"" Oct 07 13:23:10 crc kubenswrapper[4677]: I1007 13:23:10.309719 4677 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-67d5b9b9b4-m4tl8"] Oct 07 13:23:10 crc kubenswrapper[4677]: I1007 13:23:10.314335 4677 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone-67d5b9b9b4-m4tl8"] Oct 07 13:23:10 crc kubenswrapper[4677]: I1007 13:23:10.837007 4677 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["keystone-kuttl-tests/keystone-db-create-ln4h2"] Oct 07 13:23:10 crc kubenswrapper[4677]: E1007 13:23:10.837403 4677 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b12f031a-8152-4087-abf7-0eb4063f6406" containerName="keystone-api" Oct 07 13:23:10 crc kubenswrapper[4677]: I1007 13:23:10.837424 4677 state_mem.go:107] "Deleted CPUSet assignment" podUID="b12f031a-8152-4087-abf7-0eb4063f6406" containerName="keystone-api" Oct 07 13:23:10 crc kubenswrapper[4677]: E1007 13:23:10.837487 4677 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1e620e5-bd71-4d49-a2d6-a9106f477de2" containerName="keystone-api" Oct 07 13:23:10 crc kubenswrapper[4677]: I1007 13:23:10.837504 4677 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1e620e5-bd71-4d49-a2d6-a9106f477de2" containerName="keystone-api" Oct 07 13:23:10 crc kubenswrapper[4677]: I1007 13:23:10.837712 4677 memory_manager.go:354] "RemoveStaleState removing state" podUID="a1e620e5-bd71-4d49-a2d6-a9106f477de2" containerName="keystone-api" Oct 07 13:23:10 crc kubenswrapper[4677]: I1007 13:23:10.837736 4677 memory_manager.go:354] "RemoveStaleState removing state" podUID="b12f031a-8152-4087-abf7-0eb4063f6406" containerName="keystone-api" Oct 07 13:23:10 crc kubenswrapper[4677]: I1007 13:23:10.838418 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-db-create-ln4h2" Oct 07 13:23:10 crc kubenswrapper[4677]: I1007 13:23:10.845146 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-db-create-ln4h2"] Oct 07 13:23:10 crc kubenswrapper[4677]: I1007 13:23:10.935268 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c6tt5\" (UniqueName: \"kubernetes.io/projected/bfb3a5e0-9226-49bc-b044-2996c9496e20-kube-api-access-c6tt5\") pod \"keystone-db-create-ln4h2\" (UID: \"bfb3a5e0-9226-49bc-b044-2996c9496e20\") " pod="keystone-kuttl-tests/keystone-db-create-ln4h2" Oct 07 13:23:11 crc kubenswrapper[4677]: I1007 13:23:11.038647 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c6tt5\" (UniqueName: \"kubernetes.io/projected/bfb3a5e0-9226-49bc-b044-2996c9496e20-kube-api-access-c6tt5\") pod \"keystone-db-create-ln4h2\" (UID: \"bfb3a5e0-9226-49bc-b044-2996c9496e20\") " pod="keystone-kuttl-tests/keystone-db-create-ln4h2" Oct 07 13:23:11 crc kubenswrapper[4677]: I1007 13:23:11.072300 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c6tt5\" (UniqueName: \"kubernetes.io/projected/bfb3a5e0-9226-49bc-b044-2996c9496e20-kube-api-access-c6tt5\") pod \"keystone-db-create-ln4h2\" (UID: \"bfb3a5e0-9226-49bc-b044-2996c9496e20\") " pod="keystone-kuttl-tests/keystone-db-create-ln4h2" Oct 07 13:23:11 crc kubenswrapper[4677]: I1007 13:23:11.161667 4677 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-db-create-ln4h2" Oct 07 13:23:11 crc kubenswrapper[4677]: I1007 13:23:11.314363 4677 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a1e620e5-bd71-4d49-a2d6-a9106f477de2" path="/var/lib/kubelet/pods/a1e620e5-bd71-4d49-a2d6-a9106f477de2/volumes" Oct 07 13:23:11 crc kubenswrapper[4677]: I1007 13:23:11.407744 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-db-create-ln4h2"] Oct 07 13:23:11 crc kubenswrapper[4677]: W1007 13:23:11.412488 4677 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbfb3a5e0_9226_49bc_b044_2996c9496e20.slice/crio-4223d2454aa11e37592b01cab41be9074bd85fa54f277231b878796ce495231d WatchSource:0}: Error finding container 4223d2454aa11e37592b01cab41be9074bd85fa54f277231b878796ce495231d: Status 404 returned error can't find the container with id 4223d2454aa11e37592b01cab41be9074bd85fa54f277231b878796ce495231d Oct 07 13:23:12 crc kubenswrapper[4677]: I1007 13:23:12.000628 4677 generic.go:334] "Generic (PLEG): container finished" podID="bfb3a5e0-9226-49bc-b044-2996c9496e20" containerID="0e85b2bd38e151eb42702e923d86e3e5a3f33f1820e387b717d1eda5bdacf085" exitCode=0 Oct 07 13:23:12 crc kubenswrapper[4677]: I1007 13:23:12.000880 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-create-ln4h2" event={"ID":"bfb3a5e0-9226-49bc-b044-2996c9496e20","Type":"ContainerDied","Data":"0e85b2bd38e151eb42702e923d86e3e5a3f33f1820e387b717d1eda5bdacf085"} Oct 07 13:23:12 crc kubenswrapper[4677]: I1007 13:23:12.001022 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-create-ln4h2" event={"ID":"bfb3a5e0-9226-49bc-b044-2996c9496e20","Type":"ContainerStarted","Data":"4223d2454aa11e37592b01cab41be9074bd85fa54f277231b878796ce495231d"} Oct 07 13:23:13 crc kubenswrapper[4677]: I1007 13:23:13.286158 4677 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-db-create-ln4h2" Oct 07 13:23:13 crc kubenswrapper[4677]: I1007 13:23:13.375359 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c6tt5\" (UniqueName: \"kubernetes.io/projected/bfb3a5e0-9226-49bc-b044-2996c9496e20-kube-api-access-c6tt5\") pod \"bfb3a5e0-9226-49bc-b044-2996c9496e20\" (UID: \"bfb3a5e0-9226-49bc-b044-2996c9496e20\") " Oct 07 13:23:13 crc kubenswrapper[4677]: I1007 13:23:13.387108 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bfb3a5e0-9226-49bc-b044-2996c9496e20-kube-api-access-c6tt5" (OuterVolumeSpecName: "kube-api-access-c6tt5") pod "bfb3a5e0-9226-49bc-b044-2996c9496e20" (UID: "bfb3a5e0-9226-49bc-b044-2996c9496e20"). InnerVolumeSpecName "kube-api-access-c6tt5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:23:13 crc kubenswrapper[4677]: I1007 13:23:13.476905 4677 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c6tt5\" (UniqueName: \"kubernetes.io/projected/bfb3a5e0-9226-49bc-b044-2996c9496e20-kube-api-access-c6tt5\") on node \"crc\" DevicePath \"\"" Oct 07 13:23:14 crc kubenswrapper[4677]: I1007 13:23:14.019391 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-create-ln4h2" event={"ID":"bfb3a5e0-9226-49bc-b044-2996c9496e20","Type":"ContainerDied","Data":"4223d2454aa11e37592b01cab41be9074bd85fa54f277231b878796ce495231d"} Oct 07 13:23:14 crc kubenswrapper[4677]: I1007 13:23:14.019851 4677 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4223d2454aa11e37592b01cab41be9074bd85fa54f277231b878796ce495231d" Oct 07 13:23:14 crc kubenswrapper[4677]: I1007 13:23:14.019524 4677 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-db-create-ln4h2" Oct 07 13:23:21 crc kubenswrapper[4677]: I1007 13:23:21.699895 4677 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone-36ef-account-create-kdh4d"] Oct 07 13:23:21 crc kubenswrapper[4677]: E1007 13:23:21.700881 4677 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfb3a5e0-9226-49bc-b044-2996c9496e20" containerName="mariadb-database-create" Oct 07 13:23:21 crc kubenswrapper[4677]: I1007 13:23:21.700903 4677 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfb3a5e0-9226-49bc-b044-2996c9496e20" containerName="mariadb-database-create" Oct 07 13:23:21 crc kubenswrapper[4677]: I1007 13:23:21.701162 4677 memory_manager.go:354] "RemoveStaleState removing state" podUID="bfb3a5e0-9226-49bc-b044-2996c9496e20" containerName="mariadb-database-create" Oct 07 13:23:21 crc kubenswrapper[4677]: I1007 13:23:21.701863 4677 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-36ef-account-create-kdh4d" Oct 07 13:23:21 crc kubenswrapper[4677]: I1007 13:23:21.703952 4677 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-db-secret" Oct 07 13:23:21 crc kubenswrapper[4677]: I1007 13:23:21.708754 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-36ef-account-create-kdh4d"] Oct 07 13:23:21 crc kubenswrapper[4677]: I1007 13:23:21.805751 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p5w9v\" (UniqueName: \"kubernetes.io/projected/35ebf301-93d2-4af2-83c0-98b833bce405-kube-api-access-p5w9v\") pod \"keystone-36ef-account-create-kdh4d\" (UID: \"35ebf301-93d2-4af2-83c0-98b833bce405\") " pod="keystone-kuttl-tests/keystone-36ef-account-create-kdh4d" Oct 07 13:23:21 crc kubenswrapper[4677]: I1007 13:23:21.907989 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p5w9v\" (UniqueName: \"kubernetes.io/projected/35ebf301-93d2-4af2-83c0-98b833bce405-kube-api-access-p5w9v\") pod \"keystone-36ef-account-create-kdh4d\" (UID: \"35ebf301-93d2-4af2-83c0-98b833bce405\") " pod="keystone-kuttl-tests/keystone-36ef-account-create-kdh4d" Oct 07 13:23:21 crc kubenswrapper[4677]: I1007 13:23:21.943559 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p5w9v\" (UniqueName: \"kubernetes.io/projected/35ebf301-93d2-4af2-83c0-98b833bce405-kube-api-access-p5w9v\") pod \"keystone-36ef-account-create-kdh4d\" (UID: \"35ebf301-93d2-4af2-83c0-98b833bce405\") " pod="keystone-kuttl-tests/keystone-36ef-account-create-kdh4d" Oct 07 13:23:22 crc kubenswrapper[4677]: I1007 13:23:22.033822 4677 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-36ef-account-create-kdh4d" Oct 07 13:23:22 crc kubenswrapper[4677]: I1007 13:23:22.280247 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-36ef-account-create-kdh4d"] Oct 07 13:23:22 crc kubenswrapper[4677]: W1007 13:23:22.290022 4677 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod35ebf301_93d2_4af2_83c0_98b833bce405.slice/crio-acc3276a2814be935ce9b20613195d110b138527d68a467b3defe0e142ec5627 WatchSource:0}: Error finding container acc3276a2814be935ce9b20613195d110b138527d68a467b3defe0e142ec5627: Status 404 returned error can't find the container with id acc3276a2814be935ce9b20613195d110b138527d68a467b3defe0e142ec5627 Oct 07 13:23:23 crc kubenswrapper[4677]: I1007 13:23:23.100596 4677 generic.go:334] "Generic (PLEG): container finished" podID="35ebf301-93d2-4af2-83c0-98b833bce405" containerID="8a9ea4fad0a2e53abface3155349591856e21236e9fea2c57369c18036a22bca" exitCode=0 Oct 07 13:23:23 crc kubenswrapper[4677]: I1007 13:23:23.101023 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-36ef-account-create-kdh4d" event={"ID":"35ebf301-93d2-4af2-83c0-98b833bce405","Type":"ContainerDied","Data":"8a9ea4fad0a2e53abface3155349591856e21236e9fea2c57369c18036a22bca"} Oct 07 13:23:23 crc kubenswrapper[4677]: I1007 13:23:23.101061 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-36ef-account-create-kdh4d" event={"ID":"35ebf301-93d2-4af2-83c0-98b833bce405","Type":"ContainerStarted","Data":"acc3276a2814be935ce9b20613195d110b138527d68a467b3defe0e142ec5627"} Oct 07 13:23:24 crc kubenswrapper[4677]: I1007 13:23:24.470773 4677 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-36ef-account-create-kdh4d" Oct 07 13:23:24 crc kubenswrapper[4677]: I1007 13:23:24.547894 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p5w9v\" (UniqueName: \"kubernetes.io/projected/35ebf301-93d2-4af2-83c0-98b833bce405-kube-api-access-p5w9v\") pod \"35ebf301-93d2-4af2-83c0-98b833bce405\" (UID: \"35ebf301-93d2-4af2-83c0-98b833bce405\") " Oct 07 13:23:24 crc kubenswrapper[4677]: I1007 13:23:24.552715 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/35ebf301-93d2-4af2-83c0-98b833bce405-kube-api-access-p5w9v" (OuterVolumeSpecName: "kube-api-access-p5w9v") pod "35ebf301-93d2-4af2-83c0-98b833bce405" (UID: "35ebf301-93d2-4af2-83c0-98b833bce405"). InnerVolumeSpecName "kube-api-access-p5w9v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:23:24 crc kubenswrapper[4677]: I1007 13:23:24.649795 4677 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p5w9v\" (UniqueName: \"kubernetes.io/projected/35ebf301-93d2-4af2-83c0-98b833bce405-kube-api-access-p5w9v\") on node \"crc\" DevicePath \"\"" Oct 07 13:23:25 crc kubenswrapper[4677]: I1007 13:23:25.123031 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-36ef-account-create-kdh4d" event={"ID":"35ebf301-93d2-4af2-83c0-98b833bce405","Type":"ContainerDied","Data":"acc3276a2814be935ce9b20613195d110b138527d68a467b3defe0e142ec5627"} Oct 07 13:23:25 crc kubenswrapper[4677]: I1007 13:23:25.123103 4677 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="acc3276a2814be935ce9b20613195d110b138527d68a467b3defe0e142ec5627" Oct 07 13:23:25 crc kubenswrapper[4677]: I1007 13:23:25.123115 4677 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-36ef-account-create-kdh4d" Oct 07 13:23:27 crc kubenswrapper[4677]: I1007 13:23:27.267958 4677 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone-db-sync-d9cq4"] Oct 07 13:23:27 crc kubenswrapper[4677]: E1007 13:23:27.268625 4677 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35ebf301-93d2-4af2-83c0-98b833bce405" containerName="mariadb-account-create" Oct 07 13:23:27 crc kubenswrapper[4677]: I1007 13:23:27.268646 4677 state_mem.go:107] "Deleted CPUSet assignment" podUID="35ebf301-93d2-4af2-83c0-98b833bce405" containerName="mariadb-account-create" Oct 07 13:23:27 crc kubenswrapper[4677]: I1007 13:23:27.268835 4677 memory_manager.go:354] "RemoveStaleState removing state" podUID="35ebf301-93d2-4af2-83c0-98b833bce405" containerName="mariadb-account-create" Oct 07 13:23:27 crc kubenswrapper[4677]: I1007 13:23:27.269541 4677 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-db-sync-d9cq4" Oct 07 13:23:27 crc kubenswrapper[4677]: I1007 13:23:27.272136 4677 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-keystone-dockercfg-j6w47" Oct 07 13:23:27 crc kubenswrapper[4677]: I1007 13:23:27.272377 4677 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone" Oct 07 13:23:27 crc kubenswrapper[4677]: I1007 13:23:27.272408 4677 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-config-data" Oct 07 13:23:27 crc kubenswrapper[4677]: I1007 13:23:27.272979 4677 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-scripts" Oct 07 13:23:27 crc kubenswrapper[4677]: I1007 13:23:27.290324 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-db-sync-d9cq4"] Oct 07 13:23:27 crc kubenswrapper[4677]: I1007 13:23:27.392241 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5hx9f\" (UniqueName: \"kubernetes.io/projected/cd45ad31-2db5-4807-a5b5-8f1e5031dea3-kube-api-access-5hx9f\") pod \"keystone-db-sync-d9cq4\" (UID: \"cd45ad31-2db5-4807-a5b5-8f1e5031dea3\") " pod="keystone-kuttl-tests/keystone-db-sync-d9cq4" Oct 07 13:23:27 crc kubenswrapper[4677]: I1007 13:23:27.392290 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd45ad31-2db5-4807-a5b5-8f1e5031dea3-config-data\") pod \"keystone-db-sync-d9cq4\" (UID: \"cd45ad31-2db5-4807-a5b5-8f1e5031dea3\") " pod="keystone-kuttl-tests/keystone-db-sync-d9cq4" Oct 07 13:23:27 crc kubenswrapper[4677]: I1007 13:23:27.493373 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5hx9f\" (UniqueName: \"kubernetes.io/projected/cd45ad31-2db5-4807-a5b5-8f1e5031dea3-kube-api-access-5hx9f\") pod \"keystone-db-sync-d9cq4\" (UID: \"cd45ad31-2db5-4807-a5b5-8f1e5031dea3\") " pod="keystone-kuttl-tests/keystone-db-sync-d9cq4" Oct 07 13:23:27 crc kubenswrapper[4677]: I1007 13:23:27.493540 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd45ad31-2db5-4807-a5b5-8f1e5031dea3-config-data\") pod \"keystone-db-sync-d9cq4\" (UID: \"cd45ad31-2db5-4807-a5b5-8f1e5031dea3\") " pod="keystone-kuttl-tests/keystone-db-sync-d9cq4" Oct 07 13:23:27 crc kubenswrapper[4677]: I1007 13:23:27.498659 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd45ad31-2db5-4807-a5b5-8f1e5031dea3-config-data\") pod \"keystone-db-sync-d9cq4\" (UID: \"cd45ad31-2db5-4807-a5b5-8f1e5031dea3\") " pod="keystone-kuttl-tests/keystone-db-sync-d9cq4" Oct 07 13:23:27 crc kubenswrapper[4677]: I1007 13:23:27.523579 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5hx9f\" (UniqueName: \"kubernetes.io/projected/cd45ad31-2db5-4807-a5b5-8f1e5031dea3-kube-api-access-5hx9f\") pod \"keystone-db-sync-d9cq4\" (UID: \"cd45ad31-2db5-4807-a5b5-8f1e5031dea3\") " pod="keystone-kuttl-tests/keystone-db-sync-d9cq4" Oct 07 13:23:27 crc kubenswrapper[4677]: I1007 13:23:27.586252 4677 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-db-sync-d9cq4" Oct 07 13:23:27 crc kubenswrapper[4677]: I1007 13:23:27.839662 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-db-sync-d9cq4"] Oct 07 13:23:27 crc kubenswrapper[4677]: W1007 13:23:27.850198 4677 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcd45ad31_2db5_4807_a5b5_8f1e5031dea3.slice/crio-7d53f9c10dd99c429675dfa694ca908ac26cb123597cabc1eb972cbff6c8b4cf WatchSource:0}: Error finding container 7d53f9c10dd99c429675dfa694ca908ac26cb123597cabc1eb972cbff6c8b4cf: Status 404 returned error can't find the container with id 7d53f9c10dd99c429675dfa694ca908ac26cb123597cabc1eb972cbff6c8b4cf Oct 07 13:23:28 crc kubenswrapper[4677]: I1007 13:23:28.151200 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-sync-d9cq4" event={"ID":"cd45ad31-2db5-4807-a5b5-8f1e5031dea3","Type":"ContainerStarted","Data":"2a0c4af1ac9fb3f088677e28d3185c071f724652ef171a1118d32854c56a1145"} Oct 07 13:23:28 crc kubenswrapper[4677]: I1007 13:23:28.151249 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-sync-d9cq4" event={"ID":"cd45ad31-2db5-4807-a5b5-8f1e5031dea3","Type":"ContainerStarted","Data":"7d53f9c10dd99c429675dfa694ca908ac26cb123597cabc1eb972cbff6c8b4cf"} Oct 07 13:23:28 crc kubenswrapper[4677]: I1007 13:23:28.176703 4677 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keystone-kuttl-tests/keystone-db-sync-d9cq4" podStartSLOduration=1.17667295 podStartE2EDuration="1.17667295s" podCreationTimestamp="2025-10-07 13:23:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:23:28.170297676 +0000 UTC m=+979.656006881" watchObservedRunningTime="2025-10-07 13:23:28.17667295 +0000 UTC m=+979.662382105" Oct 07 13:23:30 crc kubenswrapper[4677]: I1007 13:23:30.169648 4677 generic.go:334] "Generic (PLEG): container finished" podID="cd45ad31-2db5-4807-a5b5-8f1e5031dea3" containerID="2a0c4af1ac9fb3f088677e28d3185c071f724652ef171a1118d32854c56a1145" exitCode=0 Oct 07 13:23:30 crc kubenswrapper[4677]: I1007 13:23:30.169728 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-sync-d9cq4" event={"ID":"cd45ad31-2db5-4807-a5b5-8f1e5031dea3","Type":"ContainerDied","Data":"2a0c4af1ac9fb3f088677e28d3185c071f724652ef171a1118d32854c56a1145"} Oct 07 13:23:31 crc kubenswrapper[4677]: I1007 13:23:31.485698 4677 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-db-sync-d9cq4" Oct 07 13:23:31 crc kubenswrapper[4677]: I1007 13:23:31.655832 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5hx9f\" (UniqueName: \"kubernetes.io/projected/cd45ad31-2db5-4807-a5b5-8f1e5031dea3-kube-api-access-5hx9f\") pod \"cd45ad31-2db5-4807-a5b5-8f1e5031dea3\" (UID: \"cd45ad31-2db5-4807-a5b5-8f1e5031dea3\") " Oct 07 13:23:31 crc kubenswrapper[4677]: I1007 13:23:31.656001 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd45ad31-2db5-4807-a5b5-8f1e5031dea3-config-data\") pod \"cd45ad31-2db5-4807-a5b5-8f1e5031dea3\" (UID: \"cd45ad31-2db5-4807-a5b5-8f1e5031dea3\") " Oct 07 13:23:31 crc kubenswrapper[4677]: I1007 13:23:31.662368 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd45ad31-2db5-4807-a5b5-8f1e5031dea3-kube-api-access-5hx9f" (OuterVolumeSpecName: "kube-api-access-5hx9f") pod "cd45ad31-2db5-4807-a5b5-8f1e5031dea3" (UID: "cd45ad31-2db5-4807-a5b5-8f1e5031dea3"). InnerVolumeSpecName "kube-api-access-5hx9f". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:23:31 crc kubenswrapper[4677]: I1007 13:23:31.698502 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd45ad31-2db5-4807-a5b5-8f1e5031dea3-config-data" (OuterVolumeSpecName: "config-data") pod "cd45ad31-2db5-4807-a5b5-8f1e5031dea3" (UID: "cd45ad31-2db5-4807-a5b5-8f1e5031dea3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:23:31 crc kubenswrapper[4677]: I1007 13:23:31.757246 4677 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd45ad31-2db5-4807-a5b5-8f1e5031dea3-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 13:23:31 crc kubenswrapper[4677]: I1007 13:23:31.757281 4677 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5hx9f\" (UniqueName: \"kubernetes.io/projected/cd45ad31-2db5-4807-a5b5-8f1e5031dea3-kube-api-access-5hx9f\") on node \"crc\" DevicePath \"\"" Oct 07 13:23:32 crc kubenswrapper[4677]: I1007 13:23:32.191532 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-sync-d9cq4" event={"ID":"cd45ad31-2db5-4807-a5b5-8f1e5031dea3","Type":"ContainerDied","Data":"7d53f9c10dd99c429675dfa694ca908ac26cb123597cabc1eb972cbff6c8b4cf"} Oct 07 13:23:32 crc kubenswrapper[4677]: I1007 13:23:32.191587 4677 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7d53f9c10dd99c429675dfa694ca908ac26cb123597cabc1eb972cbff6c8b4cf" Oct 07 13:23:32 crc kubenswrapper[4677]: I1007 13:23:32.191620 4677 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-db-sync-d9cq4" Oct 07 13:23:32 crc kubenswrapper[4677]: I1007 13:23:32.386621 4677 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone-bootstrap-5xk94"] Oct 07 13:23:32 crc kubenswrapper[4677]: E1007 13:23:32.387023 4677 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd45ad31-2db5-4807-a5b5-8f1e5031dea3" containerName="keystone-db-sync" Oct 07 13:23:32 crc kubenswrapper[4677]: I1007 13:23:32.387052 4677 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd45ad31-2db5-4807-a5b5-8f1e5031dea3" containerName="keystone-db-sync" Oct 07 13:23:32 crc kubenswrapper[4677]: I1007 13:23:32.387303 4677 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd45ad31-2db5-4807-a5b5-8f1e5031dea3" containerName="keystone-db-sync" Oct 07 13:23:32 crc kubenswrapper[4677]: I1007 13:23:32.388062 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-bootstrap-5xk94" Oct 07 13:23:32 crc kubenswrapper[4677]: I1007 13:23:32.393137 4677 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-scripts" Oct 07 13:23:32 crc kubenswrapper[4677]: I1007 13:23:32.393566 4677 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone" Oct 07 13:23:32 crc kubenswrapper[4677]: I1007 13:23:32.393820 4677 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-config-data" Oct 07 13:23:32 crc kubenswrapper[4677]: I1007 13:23:32.396248 4677 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-keystone-dockercfg-j6w47" Oct 07 13:23:32 crc kubenswrapper[4677]: I1007 13:23:32.403055 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-bootstrap-5xk94"] Oct 07 13:23:32 crc kubenswrapper[4677]: I1007 13:23:32.467701 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f5ab40d-dd7b-4e7d-bfbb-b36818073c82-config-data\") pod \"keystone-bootstrap-5xk94\" (UID: \"8f5ab40d-dd7b-4e7d-bfbb-b36818073c82\") " pod="keystone-kuttl-tests/keystone-bootstrap-5xk94" Oct 07 13:23:32 crc kubenswrapper[4677]: I1007 13:23:32.468044 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/8f5ab40d-dd7b-4e7d-bfbb-b36818073c82-credential-keys\") pod \"keystone-bootstrap-5xk94\" (UID: \"8f5ab40d-dd7b-4e7d-bfbb-b36818073c82\") " pod="keystone-kuttl-tests/keystone-bootstrap-5xk94" Oct 07 13:23:32 crc kubenswrapper[4677]: I1007 13:23:32.468176 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xpjsj\" (UniqueName: \"kubernetes.io/projected/8f5ab40d-dd7b-4e7d-bfbb-b36818073c82-kube-api-access-xpjsj\") pod \"keystone-bootstrap-5xk94\" (UID: \"8f5ab40d-dd7b-4e7d-bfbb-b36818073c82\") " pod="keystone-kuttl-tests/keystone-bootstrap-5xk94" Oct 07 13:23:32 crc kubenswrapper[4677]: I1007 13:23:32.468296 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8f5ab40d-dd7b-4e7d-bfbb-b36818073c82-fernet-keys\") pod \"keystone-bootstrap-5xk94\" (UID: \"8f5ab40d-dd7b-4e7d-bfbb-b36818073c82\") " pod="keystone-kuttl-tests/keystone-bootstrap-5xk94" Oct 07 
13:23:32 crc kubenswrapper[4677]: I1007 13:23:32.468544 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8f5ab40d-dd7b-4e7d-bfbb-b36818073c82-scripts\") pod \"keystone-bootstrap-5xk94\" (UID: \"8f5ab40d-dd7b-4e7d-bfbb-b36818073c82\") " pod="keystone-kuttl-tests/keystone-bootstrap-5xk94" Oct 07 13:23:32 crc kubenswrapper[4677]: I1007 13:23:32.569283 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f5ab40d-dd7b-4e7d-bfbb-b36818073c82-config-data\") pod \"keystone-bootstrap-5xk94\" (UID: \"8f5ab40d-dd7b-4e7d-bfbb-b36818073c82\") " pod="keystone-kuttl-tests/keystone-bootstrap-5xk94" Oct 07 13:23:32 crc kubenswrapper[4677]: I1007 13:23:32.569335 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/8f5ab40d-dd7b-4e7d-bfbb-b36818073c82-credential-keys\") pod \"keystone-bootstrap-5xk94\" (UID: \"8f5ab40d-dd7b-4e7d-bfbb-b36818073c82\") " pod="keystone-kuttl-tests/keystone-bootstrap-5xk94" Oct 07 13:23:32 crc kubenswrapper[4677]: I1007 13:23:32.569371 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xpjsj\" (UniqueName: \"kubernetes.io/projected/8f5ab40d-dd7b-4e7d-bfbb-b36818073c82-kube-api-access-xpjsj\") pod \"keystone-bootstrap-5xk94\" (UID: \"8f5ab40d-dd7b-4e7d-bfbb-b36818073c82\") " pod="keystone-kuttl-tests/keystone-bootstrap-5xk94" Oct 07 13:23:32 crc kubenswrapper[4677]: I1007 13:23:32.569390 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8f5ab40d-dd7b-4e7d-bfbb-b36818073c82-fernet-keys\") pod \"keystone-bootstrap-5xk94\" (UID: \"8f5ab40d-dd7b-4e7d-bfbb-b36818073c82\") " pod="keystone-kuttl-tests/keystone-bootstrap-5xk94" Oct 07 13:23:32 crc kubenswrapper[4677]: I1007 13:23:32.569426 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8f5ab40d-dd7b-4e7d-bfbb-b36818073c82-scripts\") pod \"keystone-bootstrap-5xk94\" (UID: \"8f5ab40d-dd7b-4e7d-bfbb-b36818073c82\") " pod="keystone-kuttl-tests/keystone-bootstrap-5xk94" Oct 07 13:23:32 crc kubenswrapper[4677]: I1007 13:23:32.575567 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8f5ab40d-dd7b-4e7d-bfbb-b36818073c82-fernet-keys\") pod \"keystone-bootstrap-5xk94\" (UID: \"8f5ab40d-dd7b-4e7d-bfbb-b36818073c82\") " pod="keystone-kuttl-tests/keystone-bootstrap-5xk94" Oct 07 13:23:32 crc kubenswrapper[4677]: I1007 13:23:32.577630 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8f5ab40d-dd7b-4e7d-bfbb-b36818073c82-scripts\") pod \"keystone-bootstrap-5xk94\" (UID: \"8f5ab40d-dd7b-4e7d-bfbb-b36818073c82\") " pod="keystone-kuttl-tests/keystone-bootstrap-5xk94" Oct 07 13:23:32 crc kubenswrapper[4677]: I1007 13:23:32.588161 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/8f5ab40d-dd7b-4e7d-bfbb-b36818073c82-credential-keys\") pod \"keystone-bootstrap-5xk94\" (UID: \"8f5ab40d-dd7b-4e7d-bfbb-b36818073c82\") " pod="keystone-kuttl-tests/keystone-bootstrap-5xk94" Oct 07 13:23:32 crc kubenswrapper[4677]: I1007 13:23:32.588534 4677 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f5ab40d-dd7b-4e7d-bfbb-b36818073c82-config-data\") pod \"keystone-bootstrap-5xk94\" (UID: \"8f5ab40d-dd7b-4e7d-bfbb-b36818073c82\") " pod="keystone-kuttl-tests/keystone-bootstrap-5xk94" Oct 07 13:23:32 crc kubenswrapper[4677]: I1007 13:23:32.600571 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xpjsj\" (UniqueName: \"kubernetes.io/projected/8f5ab40d-dd7b-4e7d-bfbb-b36818073c82-kube-api-access-xpjsj\") pod \"keystone-bootstrap-5xk94\" (UID: \"8f5ab40d-dd7b-4e7d-bfbb-b36818073c82\") " pod="keystone-kuttl-tests/keystone-bootstrap-5xk94" Oct 07 13:23:32 crc kubenswrapper[4677]: I1007 13:23:32.707445 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-bootstrap-5xk94" Oct 07 13:23:33 crc kubenswrapper[4677]: I1007 13:23:33.218710 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-bootstrap-5xk94"] Oct 07 13:23:33 crc kubenswrapper[4677]: W1007 13:23:33.219778 4677 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8f5ab40d_dd7b_4e7d_bfbb_b36818073c82.slice/crio-1cdffbf16dd716d153da22e70e1348e00cac0ef470d680364a324016b5a7c006 WatchSource:0}: Error finding container 1cdffbf16dd716d153da22e70e1348e00cac0ef470d680364a324016b5a7c006: Status 404 returned error can't find the container with id 1cdffbf16dd716d153da22e70e1348e00cac0ef470d680364a324016b5a7c006 Oct 07 13:23:34 crc kubenswrapper[4677]: I1007 13:23:34.209771 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-bootstrap-5xk94" event={"ID":"8f5ab40d-dd7b-4e7d-bfbb-b36818073c82","Type":"ContainerStarted","Data":"4cbee337be6cd9c277d978021e0d03671aaa66539756468a77d43a7692557f7d"} Oct 07 13:23:34 crc kubenswrapper[4677]: I1007 13:23:34.210070 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-bootstrap-5xk94" event={"ID":"8f5ab40d-dd7b-4e7d-bfbb-b36818073c82","Type":"ContainerStarted","Data":"1cdffbf16dd716d153da22e70e1348e00cac0ef470d680364a324016b5a7c006"} Oct 07 13:23:34 crc kubenswrapper[4677]: I1007 13:23:34.249880 4677 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keystone-kuttl-tests/keystone-bootstrap-5xk94" podStartSLOduration=2.249852956 podStartE2EDuration="2.249852956s" podCreationTimestamp="2025-10-07 13:23:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:23:34.231631722 +0000 UTC m=+985.717340857" watchObservedRunningTime="2025-10-07 13:23:34.249852956 +0000 UTC m=+985.735562081" Oct 07 13:23:36 crc kubenswrapper[4677]: I1007 13:23:36.226221 4677 generic.go:334] "Generic (PLEG): container finished" podID="8f5ab40d-dd7b-4e7d-bfbb-b36818073c82" containerID="4cbee337be6cd9c277d978021e0d03671aaa66539756468a77d43a7692557f7d" exitCode=0 Oct 07 13:23:36 crc kubenswrapper[4677]: I1007 13:23:36.226280 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-bootstrap-5xk94" event={"ID":"8f5ab40d-dd7b-4e7d-bfbb-b36818073c82","Type":"ContainerDied","Data":"4cbee337be6cd9c277d978021e0d03671aaa66539756468a77d43a7692557f7d"} Oct 07 13:23:37 crc kubenswrapper[4677]: I1007 13:23:37.579160 4677 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-bootstrap-5xk94" Oct 07 13:23:37 crc kubenswrapper[4677]: I1007 13:23:37.749859 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xpjsj\" (UniqueName: \"kubernetes.io/projected/8f5ab40d-dd7b-4e7d-bfbb-b36818073c82-kube-api-access-xpjsj\") pod \"8f5ab40d-dd7b-4e7d-bfbb-b36818073c82\" (UID: \"8f5ab40d-dd7b-4e7d-bfbb-b36818073c82\") " Oct 07 13:23:37 crc kubenswrapper[4677]: I1007 13:23:37.749998 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/8f5ab40d-dd7b-4e7d-bfbb-b36818073c82-credential-keys\") pod \"8f5ab40d-dd7b-4e7d-bfbb-b36818073c82\" (UID: \"8f5ab40d-dd7b-4e7d-bfbb-b36818073c82\") " Oct 07 13:23:37 crc kubenswrapper[4677]: I1007 13:23:37.750061 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f5ab40d-dd7b-4e7d-bfbb-b36818073c82-config-data\") pod \"8f5ab40d-dd7b-4e7d-bfbb-b36818073c82\" (UID: \"8f5ab40d-dd7b-4e7d-bfbb-b36818073c82\") " Oct 07 13:23:37 crc kubenswrapper[4677]: I1007 13:23:37.750190 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8f5ab40d-dd7b-4e7d-bfbb-b36818073c82-fernet-keys\") pod \"8f5ab40d-dd7b-4e7d-bfbb-b36818073c82\" (UID: \"8f5ab40d-dd7b-4e7d-bfbb-b36818073c82\") " Oct 07 13:23:37 crc kubenswrapper[4677]: I1007 13:23:37.751081 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8f5ab40d-dd7b-4e7d-bfbb-b36818073c82-scripts\") pod \"8f5ab40d-dd7b-4e7d-bfbb-b36818073c82\" (UID: \"8f5ab40d-dd7b-4e7d-bfbb-b36818073c82\") " Oct 07 13:23:37 crc kubenswrapper[4677]: I1007 13:23:37.759023 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f5ab40d-dd7b-4e7d-bfbb-b36818073c82-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "8f5ab40d-dd7b-4e7d-bfbb-b36818073c82" (UID: "8f5ab40d-dd7b-4e7d-bfbb-b36818073c82"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:23:37 crc kubenswrapper[4677]: I1007 13:23:37.759098 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f5ab40d-dd7b-4e7d-bfbb-b36818073c82-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "8f5ab40d-dd7b-4e7d-bfbb-b36818073c82" (UID: "8f5ab40d-dd7b-4e7d-bfbb-b36818073c82"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:23:37 crc kubenswrapper[4677]: I1007 13:23:37.761505 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f5ab40d-dd7b-4e7d-bfbb-b36818073c82-scripts" (OuterVolumeSpecName: "scripts") pod "8f5ab40d-dd7b-4e7d-bfbb-b36818073c82" (UID: "8f5ab40d-dd7b-4e7d-bfbb-b36818073c82"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:23:37 crc kubenswrapper[4677]: I1007 13:23:37.761907 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f5ab40d-dd7b-4e7d-bfbb-b36818073c82-kube-api-access-xpjsj" (OuterVolumeSpecName: "kube-api-access-xpjsj") pod "8f5ab40d-dd7b-4e7d-bfbb-b36818073c82" (UID: "8f5ab40d-dd7b-4e7d-bfbb-b36818073c82"). InnerVolumeSpecName "kube-api-access-xpjsj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:23:37 crc kubenswrapper[4677]: I1007 13:23:37.791520 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f5ab40d-dd7b-4e7d-bfbb-b36818073c82-config-data" (OuterVolumeSpecName: "config-data") pod "8f5ab40d-dd7b-4e7d-bfbb-b36818073c82" (UID: "8f5ab40d-dd7b-4e7d-bfbb-b36818073c82"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:23:37 crc kubenswrapper[4677]: I1007 13:23:37.855933 4677 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/8f5ab40d-dd7b-4e7d-bfbb-b36818073c82-credential-keys\") on node \"crc\" DevicePath \"\"" Oct 07 13:23:37 crc kubenswrapper[4677]: I1007 13:23:37.855983 4677 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f5ab40d-dd7b-4e7d-bfbb-b36818073c82-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 13:23:37 crc kubenswrapper[4677]: I1007 13:23:37.856001 4677 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8f5ab40d-dd7b-4e7d-bfbb-b36818073c82-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 07 13:23:37 crc kubenswrapper[4677]: I1007 13:23:37.856023 4677 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8f5ab40d-dd7b-4e7d-bfbb-b36818073c82-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 13:23:37 crc kubenswrapper[4677]: I1007 13:23:37.856045 4677 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xpjsj\" (UniqueName: \"kubernetes.io/projected/8f5ab40d-dd7b-4e7d-bfbb-b36818073c82-kube-api-access-xpjsj\") on node \"crc\" DevicePath \"\"" Oct 07 13:23:38 crc kubenswrapper[4677]: I1007 13:23:38.244626 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-bootstrap-5xk94" event={"ID":"8f5ab40d-dd7b-4e7d-bfbb-b36818073c82","Type":"ContainerDied","Data":"1cdffbf16dd716d153da22e70e1348e00cac0ef470d680364a324016b5a7c006"} Oct 07 13:23:38 crc kubenswrapper[4677]: I1007 13:23:38.244915 4677 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1cdffbf16dd716d153da22e70e1348e00cac0ef470d680364a324016b5a7c006" Oct 07 13:23:38 crc kubenswrapper[4677]: I1007 13:23:38.244707 4677 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-bootstrap-5xk94" Oct 07 13:23:38 crc kubenswrapper[4677]: I1007 13:23:38.318114 4677 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone-54f4dd5dd6-g8bmp"] Oct 07 13:23:38 crc kubenswrapper[4677]: E1007 13:23:38.318380 4677 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f5ab40d-dd7b-4e7d-bfbb-b36818073c82" containerName="keystone-bootstrap" Oct 07 13:23:38 crc kubenswrapper[4677]: I1007 13:23:38.318397 4677 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f5ab40d-dd7b-4e7d-bfbb-b36818073c82" containerName="keystone-bootstrap" Oct 07 13:23:38 crc kubenswrapper[4677]: I1007 13:23:38.318582 4677 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f5ab40d-dd7b-4e7d-bfbb-b36818073c82" containerName="keystone-bootstrap" Oct 07 13:23:38 crc kubenswrapper[4677]: I1007 13:23:38.319089 4677 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-54f4dd5dd6-g8bmp" Oct 07 13:23:38 crc kubenswrapper[4677]: I1007 13:23:38.323819 4677 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-keystone-dockercfg-j6w47" Oct 07 13:23:38 crc kubenswrapper[4677]: I1007 13:23:38.324186 4677 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-config-data" Oct 07 13:23:38 crc kubenswrapper[4677]: I1007 13:23:38.324599 4677 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-scripts" Oct 07 13:23:38 crc kubenswrapper[4677]: I1007 13:23:38.325649 4677 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone" Oct 07 13:23:38 crc kubenswrapper[4677]: I1007 13:23:38.340316 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-54f4dd5dd6-g8bmp"] Oct 07 13:23:38 crc kubenswrapper[4677]: I1007 13:23:38.465292 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/071dfc6a-adea-40fb-9011-e4c04166b624-credential-keys\") pod \"keystone-54f4dd5dd6-g8bmp\" (UID: \"071dfc6a-adea-40fb-9011-e4c04166b624\") " pod="keystone-kuttl-tests/keystone-54f4dd5dd6-g8bmp" Oct 07 13:23:38 crc kubenswrapper[4677]: I1007 13:23:38.465401 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/071dfc6a-adea-40fb-9011-e4c04166b624-config-data\") pod \"keystone-54f4dd5dd6-g8bmp\" (UID: \"071dfc6a-adea-40fb-9011-e4c04166b624\") " pod="keystone-kuttl-tests/keystone-54f4dd5dd6-g8bmp" Oct 07 13:23:38 crc kubenswrapper[4677]: I1007 13:23:38.465604 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/071dfc6a-adea-40fb-9011-e4c04166b624-scripts\") pod \"keystone-54f4dd5dd6-g8bmp\" (UID: \"071dfc6a-adea-40fb-9011-e4c04166b624\") " pod="keystone-kuttl-tests/keystone-54f4dd5dd6-g8bmp" Oct 07 13:23:38 crc kubenswrapper[4677]: I1007 13:23:38.465674 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/071dfc6a-adea-40fb-9011-e4c04166b624-fernet-keys\") pod \"keystone-54f4dd5dd6-g8bmp\" (UID: \"071dfc6a-adea-40fb-9011-e4c04166b624\") " pod="keystone-kuttl-tests/keystone-54f4dd5dd6-g8bmp" Oct 07 13:23:38 crc kubenswrapper[4677]: I1007 13:23:38.465710 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vnt4f\" (UniqueName: \"kubernetes.io/projected/071dfc6a-adea-40fb-9011-e4c04166b624-kube-api-access-vnt4f\") pod \"keystone-54f4dd5dd6-g8bmp\" (UID: \"071dfc6a-adea-40fb-9011-e4c04166b624\") " pod="keystone-kuttl-tests/keystone-54f4dd5dd6-g8bmp" Oct 07 13:23:38 crc kubenswrapper[4677]: I1007 13:23:38.567272 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/071dfc6a-adea-40fb-9011-e4c04166b624-scripts\") pod \"keystone-54f4dd5dd6-g8bmp\" (UID: \"071dfc6a-adea-40fb-9011-e4c04166b624\") " pod="keystone-kuttl-tests/keystone-54f4dd5dd6-g8bmp" Oct 07 13:23:38 crc kubenswrapper[4677]: I1007 13:23:38.567347 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: 
\"kubernetes.io/secret/071dfc6a-adea-40fb-9011-e4c04166b624-fernet-keys\") pod \"keystone-54f4dd5dd6-g8bmp\" (UID: \"071dfc6a-adea-40fb-9011-e4c04166b624\") " pod="keystone-kuttl-tests/keystone-54f4dd5dd6-g8bmp" Oct 07 13:23:38 crc kubenswrapper[4677]: I1007 13:23:38.567374 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vnt4f\" (UniqueName: \"kubernetes.io/projected/071dfc6a-adea-40fb-9011-e4c04166b624-kube-api-access-vnt4f\") pod \"keystone-54f4dd5dd6-g8bmp\" (UID: \"071dfc6a-adea-40fb-9011-e4c04166b624\") " pod="keystone-kuttl-tests/keystone-54f4dd5dd6-g8bmp" Oct 07 13:23:38 crc kubenswrapper[4677]: I1007 13:23:38.567424 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/071dfc6a-adea-40fb-9011-e4c04166b624-credential-keys\") pod \"keystone-54f4dd5dd6-g8bmp\" (UID: \"071dfc6a-adea-40fb-9011-e4c04166b624\") " pod="keystone-kuttl-tests/keystone-54f4dd5dd6-g8bmp" Oct 07 13:23:38 crc kubenswrapper[4677]: I1007 13:23:38.567473 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/071dfc6a-adea-40fb-9011-e4c04166b624-config-data\") pod \"keystone-54f4dd5dd6-g8bmp\" (UID: \"071dfc6a-adea-40fb-9011-e4c04166b624\") " pod="keystone-kuttl-tests/keystone-54f4dd5dd6-g8bmp" Oct 07 13:23:38 crc kubenswrapper[4677]: I1007 13:23:38.571903 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/071dfc6a-adea-40fb-9011-e4c04166b624-fernet-keys\") pod \"keystone-54f4dd5dd6-g8bmp\" (UID: \"071dfc6a-adea-40fb-9011-e4c04166b624\") " pod="keystone-kuttl-tests/keystone-54f4dd5dd6-g8bmp" Oct 07 13:23:38 crc kubenswrapper[4677]: I1007 13:23:38.577084 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/071dfc6a-adea-40fb-9011-e4c04166b624-credential-keys\") pod \"keystone-54f4dd5dd6-g8bmp\" (UID: \"071dfc6a-adea-40fb-9011-e4c04166b624\") " pod="keystone-kuttl-tests/keystone-54f4dd5dd6-g8bmp" Oct 07 13:23:38 crc kubenswrapper[4677]: I1007 13:23:38.577347 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/071dfc6a-adea-40fb-9011-e4c04166b624-scripts\") pod \"keystone-54f4dd5dd6-g8bmp\" (UID: \"071dfc6a-adea-40fb-9011-e4c04166b624\") " pod="keystone-kuttl-tests/keystone-54f4dd5dd6-g8bmp" Oct 07 13:23:38 crc kubenswrapper[4677]: I1007 13:23:38.577462 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/071dfc6a-adea-40fb-9011-e4c04166b624-config-data\") pod \"keystone-54f4dd5dd6-g8bmp\" (UID: \"071dfc6a-adea-40fb-9011-e4c04166b624\") " pod="keystone-kuttl-tests/keystone-54f4dd5dd6-g8bmp" Oct 07 13:23:38 crc kubenswrapper[4677]: I1007 13:23:38.596720 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vnt4f\" (UniqueName: \"kubernetes.io/projected/071dfc6a-adea-40fb-9011-e4c04166b624-kube-api-access-vnt4f\") pod \"keystone-54f4dd5dd6-g8bmp\" (UID: \"071dfc6a-adea-40fb-9011-e4c04166b624\") " pod="keystone-kuttl-tests/keystone-54f4dd5dd6-g8bmp" Oct 07 13:23:38 crc kubenswrapper[4677]: I1007 13:23:38.634653 4677 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-54f4dd5dd6-g8bmp" Oct 07 13:23:38 crc kubenswrapper[4677]: I1007 13:23:38.839527 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-54f4dd5dd6-g8bmp"] Oct 07 13:23:39 crc kubenswrapper[4677]: I1007 13:23:39.251506 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-54f4dd5dd6-g8bmp" event={"ID":"071dfc6a-adea-40fb-9011-e4c04166b624","Type":"ContainerStarted","Data":"c9766313417ed6c63c6491582618e11f2c623162bfe1ce8afb40748581d7cbb1"} Oct 07 13:23:39 crc kubenswrapper[4677]: I1007 13:23:39.251595 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-54f4dd5dd6-g8bmp" event={"ID":"071dfc6a-adea-40fb-9011-e4c04166b624","Type":"ContainerStarted","Data":"e8ca8b4ef092e4fdb54ebb198c1d95b46782421a1c3ad5f45ea70637de3eaa27"} Oct 07 13:23:39 crc kubenswrapper[4677]: I1007 13:23:39.251693 4677 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="keystone-kuttl-tests/keystone-54f4dd5dd6-g8bmp" Oct 07 13:23:39 crc kubenswrapper[4677]: I1007 13:23:39.279578 4677 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keystone-kuttl-tests/keystone-54f4dd5dd6-g8bmp" podStartSLOduration=1.279553267 podStartE2EDuration="1.279553267s" podCreationTimestamp="2025-10-07 13:23:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:23:39.272478773 +0000 UTC m=+990.758187928" watchObservedRunningTime="2025-10-07 13:23:39.279553267 +0000 UTC m=+990.765262442" Oct 07 13:23:40 crc kubenswrapper[4677]: I1007 13:23:40.917146 4677 patch_prober.go:28] interesting pod/machine-config-daemon-r7cnz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 13:23:40 crc kubenswrapper[4677]: I1007 13:23:40.917396 4677 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-r7cnz" podUID="7879fa59-a7cb-4d29-ba3a-c91f43bfcba6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 13:24:10 crc kubenswrapper[4677]: I1007 13:24:10.061277 4677 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="keystone-kuttl-tests/keystone-54f4dd5dd6-g8bmp" Oct 07 13:24:10 crc kubenswrapper[4677]: I1007 13:24:10.917242 4677 patch_prober.go:28] interesting pod/machine-config-daemon-r7cnz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 13:24:10 crc kubenswrapper[4677]: I1007 13:24:10.917318 4677 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-r7cnz" podUID="7879fa59-a7cb-4d29-ba3a-c91f43bfcba6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 13:24:11 crc kubenswrapper[4677]: I1007 13:24:11.128271 4677 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone-54f4dd5dd6-pmq2m"] Oct 07 13:24:11 crc kubenswrapper[4677]: 
I1007 13:24:11.129255 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-54f4dd5dd6-pmq2m" Oct 07 13:24:11 crc kubenswrapper[4677]: I1007 13:24:11.134758 4677 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone-54f4dd5dd6-f4sk6"] Oct 07 13:24:11 crc kubenswrapper[4677]: I1007 13:24:11.135520 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-54f4dd5dd6-f4sk6" Oct 07 13:24:11 crc kubenswrapper[4677]: I1007 13:24:11.142508 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-54f4dd5dd6-pmq2m"] Oct 07 13:24:11 crc kubenswrapper[4677]: I1007 13:24:11.149909 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-54f4dd5dd6-f4sk6"] Oct 07 13:24:11 crc kubenswrapper[4677]: I1007 13:24:11.234966 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5d2b9453-afa4-49a2-9fa5-e2041579a7b3-scripts\") pod \"keystone-54f4dd5dd6-pmq2m\" (UID: \"5d2b9453-afa4-49a2-9fa5-e2041579a7b3\") " pod="keystone-kuttl-tests/keystone-54f4dd5dd6-pmq2m" Oct 07 13:24:11 crc kubenswrapper[4677]: I1007 13:24:11.235007 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/703f8326-3055-4bc7-bfbf-6d5c87582768-scripts\") pod \"keystone-54f4dd5dd6-f4sk6\" (UID: \"703f8326-3055-4bc7-bfbf-6d5c87582768\") " pod="keystone-kuttl-tests/keystone-54f4dd5dd6-f4sk6" Oct 07 13:24:11 crc kubenswrapper[4677]: I1007 13:24:11.235084 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/5d2b9453-afa4-49a2-9fa5-e2041579a7b3-credential-keys\") pod \"keystone-54f4dd5dd6-pmq2m\" (UID: \"5d2b9453-afa4-49a2-9fa5-e2041579a7b3\") " pod="keystone-kuttl-tests/keystone-54f4dd5dd6-pmq2m" Oct 07 13:24:11 crc kubenswrapper[4677]: I1007 13:24:11.235113 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5d2b9453-afa4-49a2-9fa5-e2041579a7b3-fernet-keys\") pod \"keystone-54f4dd5dd6-pmq2m\" (UID: \"5d2b9453-afa4-49a2-9fa5-e2041579a7b3\") " pod="keystone-kuttl-tests/keystone-54f4dd5dd6-pmq2m" Oct 07 13:24:11 crc kubenswrapper[4677]: I1007 13:24:11.235134 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zcc67\" (UniqueName: \"kubernetes.io/projected/703f8326-3055-4bc7-bfbf-6d5c87582768-kube-api-access-zcc67\") pod \"keystone-54f4dd5dd6-f4sk6\" (UID: \"703f8326-3055-4bc7-bfbf-6d5c87582768\") " pod="keystone-kuttl-tests/keystone-54f4dd5dd6-f4sk6" Oct 07 13:24:11 crc kubenswrapper[4677]: I1007 13:24:11.235162 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/703f8326-3055-4bc7-bfbf-6d5c87582768-config-data\") pod \"keystone-54f4dd5dd6-f4sk6\" (UID: \"703f8326-3055-4bc7-bfbf-6d5c87582768\") " pod="keystone-kuttl-tests/keystone-54f4dd5dd6-f4sk6" Oct 07 13:24:11 crc kubenswrapper[4677]: I1007 13:24:11.235177 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wjm2r\" (UniqueName: 
\"kubernetes.io/projected/5d2b9453-afa4-49a2-9fa5-e2041579a7b3-kube-api-access-wjm2r\") pod \"keystone-54f4dd5dd6-pmq2m\" (UID: \"5d2b9453-afa4-49a2-9fa5-e2041579a7b3\") " pod="keystone-kuttl-tests/keystone-54f4dd5dd6-pmq2m" Oct 07 13:24:11 crc kubenswrapper[4677]: I1007 13:24:11.235192 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/703f8326-3055-4bc7-bfbf-6d5c87582768-fernet-keys\") pod \"keystone-54f4dd5dd6-f4sk6\" (UID: \"703f8326-3055-4bc7-bfbf-6d5c87582768\") " pod="keystone-kuttl-tests/keystone-54f4dd5dd6-f4sk6" Oct 07 13:24:11 crc kubenswrapper[4677]: I1007 13:24:11.235207 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d2b9453-afa4-49a2-9fa5-e2041579a7b3-config-data\") pod \"keystone-54f4dd5dd6-pmq2m\" (UID: \"5d2b9453-afa4-49a2-9fa5-e2041579a7b3\") " pod="keystone-kuttl-tests/keystone-54f4dd5dd6-pmq2m" Oct 07 13:24:11 crc kubenswrapper[4677]: I1007 13:24:11.235233 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/703f8326-3055-4bc7-bfbf-6d5c87582768-credential-keys\") pod \"keystone-54f4dd5dd6-f4sk6\" (UID: \"703f8326-3055-4bc7-bfbf-6d5c87582768\") " pod="keystone-kuttl-tests/keystone-54f4dd5dd6-f4sk6" Oct 07 13:24:11 crc kubenswrapper[4677]: I1007 13:24:11.336183 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5d2b9453-afa4-49a2-9fa5-e2041579a7b3-scripts\") pod \"keystone-54f4dd5dd6-pmq2m\" (UID: \"5d2b9453-afa4-49a2-9fa5-e2041579a7b3\") " pod="keystone-kuttl-tests/keystone-54f4dd5dd6-pmq2m" Oct 07 13:24:11 crc kubenswrapper[4677]: I1007 13:24:11.336230 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/703f8326-3055-4bc7-bfbf-6d5c87582768-scripts\") pod \"keystone-54f4dd5dd6-f4sk6\" (UID: \"703f8326-3055-4bc7-bfbf-6d5c87582768\") " pod="keystone-kuttl-tests/keystone-54f4dd5dd6-f4sk6" Oct 07 13:24:11 crc kubenswrapper[4677]: I1007 13:24:11.336260 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/5d2b9453-afa4-49a2-9fa5-e2041579a7b3-credential-keys\") pod \"keystone-54f4dd5dd6-pmq2m\" (UID: \"5d2b9453-afa4-49a2-9fa5-e2041579a7b3\") " pod="keystone-kuttl-tests/keystone-54f4dd5dd6-pmq2m" Oct 07 13:24:11 crc kubenswrapper[4677]: I1007 13:24:11.336286 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5d2b9453-afa4-49a2-9fa5-e2041579a7b3-fernet-keys\") pod \"keystone-54f4dd5dd6-pmq2m\" (UID: \"5d2b9453-afa4-49a2-9fa5-e2041579a7b3\") " pod="keystone-kuttl-tests/keystone-54f4dd5dd6-pmq2m" Oct 07 13:24:11 crc kubenswrapper[4677]: I1007 13:24:11.336303 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zcc67\" (UniqueName: \"kubernetes.io/projected/703f8326-3055-4bc7-bfbf-6d5c87582768-kube-api-access-zcc67\") pod \"keystone-54f4dd5dd6-f4sk6\" (UID: \"703f8326-3055-4bc7-bfbf-6d5c87582768\") " pod="keystone-kuttl-tests/keystone-54f4dd5dd6-f4sk6" Oct 07 13:24:11 crc kubenswrapper[4677]: I1007 13:24:11.336340 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/703f8326-3055-4bc7-bfbf-6d5c87582768-config-data\") pod \"keystone-54f4dd5dd6-f4sk6\" (UID: \"703f8326-3055-4bc7-bfbf-6d5c87582768\") " pod="keystone-kuttl-tests/keystone-54f4dd5dd6-f4sk6" Oct 07 13:24:11 crc kubenswrapper[4677]: I1007 13:24:11.336358 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wjm2r\" (UniqueName: \"kubernetes.io/projected/5d2b9453-afa4-49a2-9fa5-e2041579a7b3-kube-api-access-wjm2r\") pod \"keystone-54f4dd5dd6-pmq2m\" (UID: \"5d2b9453-afa4-49a2-9fa5-e2041579a7b3\") " pod="keystone-kuttl-tests/keystone-54f4dd5dd6-pmq2m" Oct 07 13:24:11 crc kubenswrapper[4677]: I1007 13:24:11.336376 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/703f8326-3055-4bc7-bfbf-6d5c87582768-fernet-keys\") pod \"keystone-54f4dd5dd6-f4sk6\" (UID: \"703f8326-3055-4bc7-bfbf-6d5c87582768\") " pod="keystone-kuttl-tests/keystone-54f4dd5dd6-f4sk6" Oct 07 13:24:11 crc kubenswrapper[4677]: I1007 13:24:11.336403 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d2b9453-afa4-49a2-9fa5-e2041579a7b3-config-data\") pod \"keystone-54f4dd5dd6-pmq2m\" (UID: \"5d2b9453-afa4-49a2-9fa5-e2041579a7b3\") " pod="keystone-kuttl-tests/keystone-54f4dd5dd6-pmq2m" Oct 07 13:24:11 crc kubenswrapper[4677]: I1007 13:24:11.336463 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/703f8326-3055-4bc7-bfbf-6d5c87582768-credential-keys\") pod \"keystone-54f4dd5dd6-f4sk6\" (UID: \"703f8326-3055-4bc7-bfbf-6d5c87582768\") " pod="keystone-kuttl-tests/keystone-54f4dd5dd6-f4sk6" Oct 07 13:24:11 crc kubenswrapper[4677]: I1007 13:24:11.342861 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5d2b9453-afa4-49a2-9fa5-e2041579a7b3-scripts\") pod \"keystone-54f4dd5dd6-pmq2m\" (UID: \"5d2b9453-afa4-49a2-9fa5-e2041579a7b3\") " pod="keystone-kuttl-tests/keystone-54f4dd5dd6-pmq2m" Oct 07 13:24:11 crc kubenswrapper[4677]: I1007 13:24:11.342871 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/703f8326-3055-4bc7-bfbf-6d5c87582768-scripts\") pod \"keystone-54f4dd5dd6-f4sk6\" (UID: \"703f8326-3055-4bc7-bfbf-6d5c87582768\") " pod="keystone-kuttl-tests/keystone-54f4dd5dd6-f4sk6" Oct 07 13:24:11 crc kubenswrapper[4677]: I1007 13:24:11.343251 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/703f8326-3055-4bc7-bfbf-6d5c87582768-fernet-keys\") pod \"keystone-54f4dd5dd6-f4sk6\" (UID: \"703f8326-3055-4bc7-bfbf-6d5c87582768\") " pod="keystone-kuttl-tests/keystone-54f4dd5dd6-f4sk6" Oct 07 13:24:11 crc kubenswrapper[4677]: I1007 13:24:11.343518 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d2b9453-afa4-49a2-9fa5-e2041579a7b3-config-data\") pod \"keystone-54f4dd5dd6-pmq2m\" (UID: \"5d2b9453-afa4-49a2-9fa5-e2041579a7b3\") " pod="keystone-kuttl-tests/keystone-54f4dd5dd6-pmq2m" Oct 07 13:24:11 crc kubenswrapper[4677]: I1007 13:24:11.344305 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/703f8326-3055-4bc7-bfbf-6d5c87582768-credential-keys\") pod 
\"keystone-54f4dd5dd6-f4sk6\" (UID: \"703f8326-3055-4bc7-bfbf-6d5c87582768\") " pod="keystone-kuttl-tests/keystone-54f4dd5dd6-f4sk6" Oct 07 13:24:11 crc kubenswrapper[4677]: I1007 13:24:11.345767 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5d2b9453-afa4-49a2-9fa5-e2041579a7b3-fernet-keys\") pod \"keystone-54f4dd5dd6-pmq2m\" (UID: \"5d2b9453-afa4-49a2-9fa5-e2041579a7b3\") " pod="keystone-kuttl-tests/keystone-54f4dd5dd6-pmq2m" Oct 07 13:24:11 crc kubenswrapper[4677]: I1007 13:24:11.346971 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/5d2b9453-afa4-49a2-9fa5-e2041579a7b3-credential-keys\") pod \"keystone-54f4dd5dd6-pmq2m\" (UID: \"5d2b9453-afa4-49a2-9fa5-e2041579a7b3\") " pod="keystone-kuttl-tests/keystone-54f4dd5dd6-pmq2m" Oct 07 13:24:11 crc kubenswrapper[4677]: I1007 13:24:11.351382 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/703f8326-3055-4bc7-bfbf-6d5c87582768-config-data\") pod \"keystone-54f4dd5dd6-f4sk6\" (UID: \"703f8326-3055-4bc7-bfbf-6d5c87582768\") " pod="keystone-kuttl-tests/keystone-54f4dd5dd6-f4sk6" Oct 07 13:24:11 crc kubenswrapper[4677]: I1007 13:24:11.353416 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wjm2r\" (UniqueName: \"kubernetes.io/projected/5d2b9453-afa4-49a2-9fa5-e2041579a7b3-kube-api-access-wjm2r\") pod \"keystone-54f4dd5dd6-pmq2m\" (UID: \"5d2b9453-afa4-49a2-9fa5-e2041579a7b3\") " pod="keystone-kuttl-tests/keystone-54f4dd5dd6-pmq2m" Oct 07 13:24:11 crc kubenswrapper[4677]: I1007 13:24:11.355631 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zcc67\" (UniqueName: \"kubernetes.io/projected/703f8326-3055-4bc7-bfbf-6d5c87582768-kube-api-access-zcc67\") pod \"keystone-54f4dd5dd6-f4sk6\" (UID: \"703f8326-3055-4bc7-bfbf-6d5c87582768\") " pod="keystone-kuttl-tests/keystone-54f4dd5dd6-f4sk6" Oct 07 13:24:11 crc kubenswrapper[4677]: I1007 13:24:11.450931 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-54f4dd5dd6-pmq2m" Oct 07 13:24:11 crc kubenswrapper[4677]: I1007 13:24:11.458632 4677 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-54f4dd5dd6-f4sk6" Oct 07 13:24:11 crc kubenswrapper[4677]: I1007 13:24:11.864696 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-54f4dd5dd6-pmq2m"] Oct 07 13:24:11 crc kubenswrapper[4677]: I1007 13:24:11.924723 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-54f4dd5dd6-f4sk6"] Oct 07 13:24:12 crc kubenswrapper[4677]: I1007 13:24:12.573617 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-54f4dd5dd6-pmq2m" event={"ID":"5d2b9453-afa4-49a2-9fa5-e2041579a7b3","Type":"ContainerStarted","Data":"cb95d317abdb71922a21a4772631fa32867572fe1af0ca20369cd1d0fff18e4d"} Oct 07 13:24:12 crc kubenswrapper[4677]: I1007 13:24:12.573985 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-54f4dd5dd6-pmq2m" event={"ID":"5d2b9453-afa4-49a2-9fa5-e2041579a7b3","Type":"ContainerStarted","Data":"dfd869d358688cc2277db1109458795a898cb2646816c8df1187b1a17d0642a4"} Oct 07 13:24:12 crc kubenswrapper[4677]: I1007 13:24:12.574014 4677 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="keystone-kuttl-tests/keystone-54f4dd5dd6-pmq2m" Oct 07 13:24:12 crc kubenswrapper[4677]: I1007 13:24:12.575904 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-54f4dd5dd6-f4sk6" event={"ID":"703f8326-3055-4bc7-bfbf-6d5c87582768","Type":"ContainerStarted","Data":"f55dd7328f4b1b8f1b1468dad60bfc8d413d7ac13becad4c162c9859335b3735"} Oct 07 13:24:12 crc kubenswrapper[4677]: I1007 13:24:12.575984 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-54f4dd5dd6-f4sk6" event={"ID":"703f8326-3055-4bc7-bfbf-6d5c87582768","Type":"ContainerStarted","Data":"e0d61019c22cffaa270b55946d8cdc23ce74825d6dcda322b323246a88b08c65"} Oct 07 13:24:12 crc kubenswrapper[4677]: I1007 13:24:12.576153 4677 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="keystone-kuttl-tests/keystone-54f4dd5dd6-f4sk6" Oct 07 13:24:12 crc kubenswrapper[4677]: I1007 13:24:12.598996 4677 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keystone-kuttl-tests/keystone-54f4dd5dd6-pmq2m" podStartSLOduration=1.598887517 podStartE2EDuration="1.598887517s" podCreationTimestamp="2025-10-07 13:24:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:24:12.595058207 +0000 UTC m=+1024.080767372" watchObservedRunningTime="2025-10-07 13:24:12.598887517 +0000 UTC m=+1024.084596712" Oct 07 13:24:12 crc kubenswrapper[4677]: I1007 13:24:12.620199 4677 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keystone-kuttl-tests/keystone-54f4dd5dd6-f4sk6" podStartSLOduration=1.620169149 podStartE2EDuration="1.620169149s" podCreationTimestamp="2025-10-07 13:24:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:24:12.61358346 +0000 UTC m=+1024.099292665" watchObservedRunningTime="2025-10-07 13:24:12.620169149 +0000 UTC m=+1024.105878324" Oct 07 13:24:40 crc kubenswrapper[4677]: I1007 13:24:40.918252 4677 patch_prober.go:28] interesting pod/machine-config-daemon-r7cnz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: 
connect: connection refused" start-of-body= Oct 07 13:24:40 crc kubenswrapper[4677]: I1007 13:24:40.919043 4677 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-r7cnz" podUID="7879fa59-a7cb-4d29-ba3a-c91f43bfcba6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 13:24:40 crc kubenswrapper[4677]: I1007 13:24:40.919120 4677 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-r7cnz" Oct 07 13:24:40 crc kubenswrapper[4677]: I1007 13:24:40.920000 4677 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"37fd49a51a3bd5137d45d074e28b4ab8e0800f2fea4c41dcc155e9985c92e63a"} pod="openshift-machine-config-operator/machine-config-daemon-r7cnz" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 07 13:24:40 crc kubenswrapper[4677]: I1007 13:24:40.920099 4677 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-r7cnz" podUID="7879fa59-a7cb-4d29-ba3a-c91f43bfcba6" containerName="machine-config-daemon" containerID="cri-o://37fd49a51a3bd5137d45d074e28b4ab8e0800f2fea4c41dcc155e9985c92e63a" gracePeriod=600 Oct 07 13:24:41 crc kubenswrapper[4677]: I1007 13:24:41.790276 4677 generic.go:334] "Generic (PLEG): container finished" podID="7879fa59-a7cb-4d29-ba3a-c91f43bfcba6" containerID="37fd49a51a3bd5137d45d074e28b4ab8e0800f2fea4c41dcc155e9985c92e63a" exitCode=0 Oct 07 13:24:41 crc kubenswrapper[4677]: I1007 13:24:41.790355 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-r7cnz" event={"ID":"7879fa59-a7cb-4d29-ba3a-c91f43bfcba6","Type":"ContainerDied","Data":"37fd49a51a3bd5137d45d074e28b4ab8e0800f2fea4c41dcc155e9985c92e63a"} Oct 07 13:24:41 crc kubenswrapper[4677]: I1007 13:24:41.790553 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-r7cnz" event={"ID":"7879fa59-a7cb-4d29-ba3a-c91f43bfcba6","Type":"ContainerStarted","Data":"31488fc6d756c96c0843de0c3c1b89a15439285821997035e8781aaf44c08c84"} Oct 07 13:24:41 crc kubenswrapper[4677]: I1007 13:24:41.790570 4677 scope.go:117] "RemoveContainer" containerID="ec42d23040a8452012f48d89f3054555831bcfb79cefb8a91a87385178f388c8" Oct 07 13:24:42 crc kubenswrapper[4677]: I1007 13:24:42.829451 4677 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="keystone-kuttl-tests/keystone-54f4dd5dd6-f4sk6" Oct 07 13:24:42 crc kubenswrapper[4677]: I1007 13:24:42.933623 4677 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="keystone-kuttl-tests/keystone-54f4dd5dd6-pmq2m" Oct 07 13:24:44 crc kubenswrapper[4677]: I1007 13:24:44.061853 4677 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-54f4dd5dd6-pmq2m"] Oct 07 13:24:44 crc kubenswrapper[4677]: I1007 13:24:44.062103 4677 kuberuntime_container.go:808] "Killing container with a grace period" pod="keystone-kuttl-tests/keystone-54f4dd5dd6-pmq2m" podUID="5d2b9453-afa4-49a2-9fa5-e2041579a7b3" containerName="keystone-api" containerID="cri-o://cb95d317abdb71922a21a4772631fa32867572fe1af0ca20369cd1d0fff18e4d" gracePeriod=30 Oct 07 13:24:44 crc kubenswrapper[4677]: I1007 
13:24:44.082626 4677 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-54f4dd5dd6-f4sk6"] Oct 07 13:24:44 crc kubenswrapper[4677]: I1007 13:24:44.083208 4677 kuberuntime_container.go:808] "Killing container with a grace period" pod="keystone-kuttl-tests/keystone-54f4dd5dd6-f4sk6" podUID="703f8326-3055-4bc7-bfbf-6d5c87582768" containerName="keystone-api" containerID="cri-o://f55dd7328f4b1b8f1b1468dad60bfc8d413d7ac13becad4c162c9859335b3735" gracePeriod=30 Oct 07 13:24:45 crc kubenswrapper[4677]: I1007 13:24:45.294153 4677 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-54f4dd5dd6-g8bmp"] Oct 07 13:24:45 crc kubenswrapper[4677]: I1007 13:24:45.294537 4677 kuberuntime_container.go:808] "Killing container with a grace period" pod="keystone-kuttl-tests/keystone-54f4dd5dd6-g8bmp" podUID="071dfc6a-adea-40fb-9011-e4c04166b624" containerName="keystone-api" containerID="cri-o://c9766313417ed6c63c6491582618e11f2c623162bfe1ce8afb40748581d7cbb1" gracePeriod=30 Oct 07 13:24:47 crc kubenswrapper[4677]: I1007 13:24:47.561180 4677 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-54f4dd5dd6-pmq2m" Oct 07 13:24:47 crc kubenswrapper[4677]: I1007 13:24:47.589007 4677 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-54f4dd5dd6-f4sk6" Oct 07 13:24:47 crc kubenswrapper[4677]: I1007 13:24:47.679165 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5d2b9453-afa4-49a2-9fa5-e2041579a7b3-fernet-keys\") pod \"5d2b9453-afa4-49a2-9fa5-e2041579a7b3\" (UID: \"5d2b9453-afa4-49a2-9fa5-e2041579a7b3\") " Oct 07 13:24:47 crc kubenswrapper[4677]: I1007 13:24:47.679218 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/5d2b9453-afa4-49a2-9fa5-e2041579a7b3-credential-keys\") pod \"5d2b9453-afa4-49a2-9fa5-e2041579a7b3\" (UID: \"5d2b9453-afa4-49a2-9fa5-e2041579a7b3\") " Oct 07 13:24:47 crc kubenswrapper[4677]: I1007 13:24:47.679281 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5d2b9453-afa4-49a2-9fa5-e2041579a7b3-scripts\") pod \"5d2b9453-afa4-49a2-9fa5-e2041579a7b3\" (UID: \"5d2b9453-afa4-49a2-9fa5-e2041579a7b3\") " Oct 07 13:24:47 crc kubenswrapper[4677]: I1007 13:24:47.679311 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zcc67\" (UniqueName: \"kubernetes.io/projected/703f8326-3055-4bc7-bfbf-6d5c87582768-kube-api-access-zcc67\") pod \"703f8326-3055-4bc7-bfbf-6d5c87582768\" (UID: \"703f8326-3055-4bc7-bfbf-6d5c87582768\") " Oct 07 13:24:47 crc kubenswrapper[4677]: I1007 13:24:47.679347 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wjm2r\" (UniqueName: \"kubernetes.io/projected/5d2b9453-afa4-49a2-9fa5-e2041579a7b3-kube-api-access-wjm2r\") pod \"5d2b9453-afa4-49a2-9fa5-e2041579a7b3\" (UID: \"5d2b9453-afa4-49a2-9fa5-e2041579a7b3\") " Oct 07 13:24:47 crc kubenswrapper[4677]: I1007 13:24:47.679373 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/703f8326-3055-4bc7-bfbf-6d5c87582768-credential-keys\") pod \"703f8326-3055-4bc7-bfbf-6d5c87582768\" (UID: \"703f8326-3055-4bc7-bfbf-6d5c87582768\") " 
Oct 07 13:24:47 crc kubenswrapper[4677]: I1007 13:24:47.679394 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/703f8326-3055-4bc7-bfbf-6d5c87582768-fernet-keys\") pod \"703f8326-3055-4bc7-bfbf-6d5c87582768\" (UID: \"703f8326-3055-4bc7-bfbf-6d5c87582768\") " Oct 07 13:24:47 crc kubenswrapper[4677]: I1007 13:24:47.679445 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d2b9453-afa4-49a2-9fa5-e2041579a7b3-config-data\") pod \"5d2b9453-afa4-49a2-9fa5-e2041579a7b3\" (UID: \"5d2b9453-afa4-49a2-9fa5-e2041579a7b3\") " Oct 07 13:24:47 crc kubenswrapper[4677]: I1007 13:24:47.679476 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/703f8326-3055-4bc7-bfbf-6d5c87582768-config-data\") pod \"703f8326-3055-4bc7-bfbf-6d5c87582768\" (UID: \"703f8326-3055-4bc7-bfbf-6d5c87582768\") " Oct 07 13:24:47 crc kubenswrapper[4677]: I1007 13:24:47.679517 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/703f8326-3055-4bc7-bfbf-6d5c87582768-scripts\") pod \"703f8326-3055-4bc7-bfbf-6d5c87582768\" (UID: \"703f8326-3055-4bc7-bfbf-6d5c87582768\") " Oct 07 13:24:47 crc kubenswrapper[4677]: I1007 13:24:47.686405 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/703f8326-3055-4bc7-bfbf-6d5c87582768-scripts" (OuterVolumeSpecName: "scripts") pod "703f8326-3055-4bc7-bfbf-6d5c87582768" (UID: "703f8326-3055-4bc7-bfbf-6d5c87582768"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:24:47 crc kubenswrapper[4677]: I1007 13:24:47.686397 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d2b9453-afa4-49a2-9fa5-e2041579a7b3-kube-api-access-wjm2r" (OuterVolumeSpecName: "kube-api-access-wjm2r") pod "5d2b9453-afa4-49a2-9fa5-e2041579a7b3" (UID: "5d2b9453-afa4-49a2-9fa5-e2041579a7b3"). InnerVolumeSpecName "kube-api-access-wjm2r". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:24:47 crc kubenswrapper[4677]: I1007 13:24:47.686419 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/703f8326-3055-4bc7-bfbf-6d5c87582768-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "703f8326-3055-4bc7-bfbf-6d5c87582768" (UID: "703f8326-3055-4bc7-bfbf-6d5c87582768"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:24:47 crc kubenswrapper[4677]: I1007 13:24:47.687708 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/703f8326-3055-4bc7-bfbf-6d5c87582768-kube-api-access-zcc67" (OuterVolumeSpecName: "kube-api-access-zcc67") pod "703f8326-3055-4bc7-bfbf-6d5c87582768" (UID: "703f8326-3055-4bc7-bfbf-6d5c87582768"). InnerVolumeSpecName "kube-api-access-zcc67". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:24:47 crc kubenswrapper[4677]: I1007 13:24:47.689545 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d2b9453-afa4-49a2-9fa5-e2041579a7b3-scripts" (OuterVolumeSpecName: "scripts") pod "5d2b9453-afa4-49a2-9fa5-e2041579a7b3" (UID: "5d2b9453-afa4-49a2-9fa5-e2041579a7b3"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:24:47 crc kubenswrapper[4677]: I1007 13:24:47.689552 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/703f8326-3055-4bc7-bfbf-6d5c87582768-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "703f8326-3055-4bc7-bfbf-6d5c87582768" (UID: "703f8326-3055-4bc7-bfbf-6d5c87582768"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:24:47 crc kubenswrapper[4677]: I1007 13:24:47.689578 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d2b9453-afa4-49a2-9fa5-e2041579a7b3-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "5d2b9453-afa4-49a2-9fa5-e2041579a7b3" (UID: "5d2b9453-afa4-49a2-9fa5-e2041579a7b3"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:24:47 crc kubenswrapper[4677]: I1007 13:24:47.689797 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d2b9453-afa4-49a2-9fa5-e2041579a7b3-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "5d2b9453-afa4-49a2-9fa5-e2041579a7b3" (UID: "5d2b9453-afa4-49a2-9fa5-e2041579a7b3"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:24:47 crc kubenswrapper[4677]: I1007 13:24:47.703592 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d2b9453-afa4-49a2-9fa5-e2041579a7b3-config-data" (OuterVolumeSpecName: "config-data") pod "5d2b9453-afa4-49a2-9fa5-e2041579a7b3" (UID: "5d2b9453-afa4-49a2-9fa5-e2041579a7b3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:24:47 crc kubenswrapper[4677]: I1007 13:24:47.703652 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/703f8326-3055-4bc7-bfbf-6d5c87582768-config-data" (OuterVolumeSpecName: "config-data") pod "703f8326-3055-4bc7-bfbf-6d5c87582768" (UID: "703f8326-3055-4bc7-bfbf-6d5c87582768"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:24:47 crc kubenswrapper[4677]: I1007 13:24:47.781556 4677 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5d2b9453-afa4-49a2-9fa5-e2041579a7b3-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 07 13:24:47 crc kubenswrapper[4677]: I1007 13:24:47.781597 4677 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/5d2b9453-afa4-49a2-9fa5-e2041579a7b3-credential-keys\") on node \"crc\" DevicePath \"\"" Oct 07 13:24:47 crc kubenswrapper[4677]: I1007 13:24:47.781612 4677 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5d2b9453-afa4-49a2-9fa5-e2041579a7b3-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 13:24:47 crc kubenswrapper[4677]: I1007 13:24:47.781624 4677 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zcc67\" (UniqueName: \"kubernetes.io/projected/703f8326-3055-4bc7-bfbf-6d5c87582768-kube-api-access-zcc67\") on node \"crc\" DevicePath \"\"" Oct 07 13:24:47 crc kubenswrapper[4677]: I1007 13:24:47.781639 4677 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wjm2r\" (UniqueName: \"kubernetes.io/projected/5d2b9453-afa4-49a2-9fa5-e2041579a7b3-kube-api-access-wjm2r\") on node \"crc\" DevicePath \"\"" Oct 07 13:24:47 crc kubenswrapper[4677]: I1007 13:24:47.781650 4677 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/703f8326-3055-4bc7-bfbf-6d5c87582768-credential-keys\") on node \"crc\" DevicePath \"\"" Oct 07 13:24:47 crc kubenswrapper[4677]: I1007 13:24:47.781661 4677 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/703f8326-3055-4bc7-bfbf-6d5c87582768-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 07 13:24:47 crc kubenswrapper[4677]: I1007 13:24:47.781672 4677 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d2b9453-afa4-49a2-9fa5-e2041579a7b3-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 13:24:47 crc kubenswrapper[4677]: I1007 13:24:47.781683 4677 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/703f8326-3055-4bc7-bfbf-6d5c87582768-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 13:24:47 crc kubenswrapper[4677]: I1007 13:24:47.781696 4677 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/703f8326-3055-4bc7-bfbf-6d5c87582768-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 13:24:47 crc kubenswrapper[4677]: I1007 13:24:47.858331 4677 generic.go:334] "Generic (PLEG): container finished" podID="703f8326-3055-4bc7-bfbf-6d5c87582768" containerID="f55dd7328f4b1b8f1b1468dad60bfc8d413d7ac13becad4c162c9859335b3735" exitCode=0 Oct 07 13:24:47 crc kubenswrapper[4677]: I1007 13:24:47.858375 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-54f4dd5dd6-f4sk6" event={"ID":"703f8326-3055-4bc7-bfbf-6d5c87582768","Type":"ContainerDied","Data":"f55dd7328f4b1b8f1b1468dad60bfc8d413d7ac13becad4c162c9859335b3735"} Oct 07 13:24:47 crc kubenswrapper[4677]: I1007 13:24:47.858782 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-54f4dd5dd6-f4sk6" 
event={"ID":"703f8326-3055-4bc7-bfbf-6d5c87582768","Type":"ContainerDied","Data":"e0d61019c22cffaa270b55946d8cdc23ce74825d6dcda322b323246a88b08c65"} Oct 07 13:24:47 crc kubenswrapper[4677]: I1007 13:24:47.858818 4677 scope.go:117] "RemoveContainer" containerID="f55dd7328f4b1b8f1b1468dad60bfc8d413d7ac13becad4c162c9859335b3735" Oct 07 13:24:47 crc kubenswrapper[4677]: I1007 13:24:47.858423 4677 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-54f4dd5dd6-f4sk6" Oct 07 13:24:47 crc kubenswrapper[4677]: I1007 13:24:47.861357 4677 generic.go:334] "Generic (PLEG): container finished" podID="5d2b9453-afa4-49a2-9fa5-e2041579a7b3" containerID="cb95d317abdb71922a21a4772631fa32867572fe1af0ca20369cd1d0fff18e4d" exitCode=0 Oct 07 13:24:47 crc kubenswrapper[4677]: I1007 13:24:47.861411 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-54f4dd5dd6-pmq2m" event={"ID":"5d2b9453-afa4-49a2-9fa5-e2041579a7b3","Type":"ContainerDied","Data":"cb95d317abdb71922a21a4772631fa32867572fe1af0ca20369cd1d0fff18e4d"} Oct 07 13:24:47 crc kubenswrapper[4677]: I1007 13:24:47.861474 4677 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-54f4dd5dd6-pmq2m" Oct 07 13:24:47 crc kubenswrapper[4677]: I1007 13:24:47.861509 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-54f4dd5dd6-pmq2m" event={"ID":"5d2b9453-afa4-49a2-9fa5-e2041579a7b3","Type":"ContainerDied","Data":"dfd869d358688cc2277db1109458795a898cb2646816c8df1187b1a17d0642a4"} Oct 07 13:24:47 crc kubenswrapper[4677]: I1007 13:24:47.897032 4677 scope.go:117] "RemoveContainer" containerID="f55dd7328f4b1b8f1b1468dad60bfc8d413d7ac13becad4c162c9859335b3735" Oct 07 13:24:47 crc kubenswrapper[4677]: E1007 13:24:47.897662 4677 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f55dd7328f4b1b8f1b1468dad60bfc8d413d7ac13becad4c162c9859335b3735\": container with ID starting with f55dd7328f4b1b8f1b1468dad60bfc8d413d7ac13becad4c162c9859335b3735 not found: ID does not exist" containerID="f55dd7328f4b1b8f1b1468dad60bfc8d413d7ac13becad4c162c9859335b3735" Oct 07 13:24:47 crc kubenswrapper[4677]: I1007 13:24:47.897697 4677 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f55dd7328f4b1b8f1b1468dad60bfc8d413d7ac13becad4c162c9859335b3735"} err="failed to get container status \"f55dd7328f4b1b8f1b1468dad60bfc8d413d7ac13becad4c162c9859335b3735\": rpc error: code = NotFound desc = could not find container \"f55dd7328f4b1b8f1b1468dad60bfc8d413d7ac13becad4c162c9859335b3735\": container with ID starting with f55dd7328f4b1b8f1b1468dad60bfc8d413d7ac13becad4c162c9859335b3735 not found: ID does not exist" Oct 07 13:24:47 crc kubenswrapper[4677]: I1007 13:24:47.897720 4677 scope.go:117] "RemoveContainer" containerID="cb95d317abdb71922a21a4772631fa32867572fe1af0ca20369cd1d0fff18e4d" Oct 07 13:24:47 crc kubenswrapper[4677]: I1007 13:24:47.923565 4677 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-54f4dd5dd6-f4sk6"] Oct 07 13:24:47 crc kubenswrapper[4677]: I1007 13:24:47.931025 4677 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone-54f4dd5dd6-f4sk6"] Oct 07 13:24:47 crc kubenswrapper[4677]: I1007 13:24:47.935268 4677 scope.go:117] "RemoveContainer" 
containerID="cb95d317abdb71922a21a4772631fa32867572fe1af0ca20369cd1d0fff18e4d" Oct 07 13:24:47 crc kubenswrapper[4677]: E1007 13:24:47.935897 4677 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb95d317abdb71922a21a4772631fa32867572fe1af0ca20369cd1d0fff18e4d\": container with ID starting with cb95d317abdb71922a21a4772631fa32867572fe1af0ca20369cd1d0fff18e4d not found: ID does not exist" containerID="cb95d317abdb71922a21a4772631fa32867572fe1af0ca20369cd1d0fff18e4d" Oct 07 13:24:47 crc kubenswrapper[4677]: I1007 13:24:47.935937 4677 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb95d317abdb71922a21a4772631fa32867572fe1af0ca20369cd1d0fff18e4d"} err="failed to get container status \"cb95d317abdb71922a21a4772631fa32867572fe1af0ca20369cd1d0fff18e4d\": rpc error: code = NotFound desc = could not find container \"cb95d317abdb71922a21a4772631fa32867572fe1af0ca20369cd1d0fff18e4d\": container with ID starting with cb95d317abdb71922a21a4772631fa32867572fe1af0ca20369cd1d0fff18e4d not found: ID does not exist" Oct 07 13:24:47 crc kubenswrapper[4677]: I1007 13:24:47.936470 4677 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-54f4dd5dd6-pmq2m"] Oct 07 13:24:47 crc kubenswrapper[4677]: I1007 13:24:47.941510 4677 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone-54f4dd5dd6-pmq2m"] Oct 07 13:24:48 crc kubenswrapper[4677]: I1007 13:24:48.810289 4677 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-54f4dd5dd6-g8bmp" Oct 07 13:24:48 crc kubenswrapper[4677]: I1007 13:24:48.870077 4677 generic.go:334] "Generic (PLEG): container finished" podID="071dfc6a-adea-40fb-9011-e4c04166b624" containerID="c9766313417ed6c63c6491582618e11f2c623162bfe1ce8afb40748581d7cbb1" exitCode=0 Oct 07 13:24:48 crc kubenswrapper[4677]: I1007 13:24:48.870209 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-54f4dd5dd6-g8bmp" event={"ID":"071dfc6a-adea-40fb-9011-e4c04166b624","Type":"ContainerDied","Data":"c9766313417ed6c63c6491582618e11f2c623162bfe1ce8afb40748581d7cbb1"} Oct 07 13:24:48 crc kubenswrapper[4677]: I1007 13:24:48.870267 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-54f4dd5dd6-g8bmp" event={"ID":"071dfc6a-adea-40fb-9011-e4c04166b624","Type":"ContainerDied","Data":"e8ca8b4ef092e4fdb54ebb198c1d95b46782421a1c3ad5f45ea70637de3eaa27"} Oct 07 13:24:48 crc kubenswrapper[4677]: I1007 13:24:48.870265 4677 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-54f4dd5dd6-g8bmp" Oct 07 13:24:48 crc kubenswrapper[4677]: I1007 13:24:48.870293 4677 scope.go:117] "RemoveContainer" containerID="c9766313417ed6c63c6491582618e11f2c623162bfe1ce8afb40748581d7cbb1" Oct 07 13:24:48 crc kubenswrapper[4677]: I1007 13:24:48.891251 4677 scope.go:117] "RemoveContainer" containerID="c9766313417ed6c63c6491582618e11f2c623162bfe1ce8afb40748581d7cbb1" Oct 07 13:24:48 crc kubenswrapper[4677]: E1007 13:24:48.891831 4677 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c9766313417ed6c63c6491582618e11f2c623162bfe1ce8afb40748581d7cbb1\": container with ID starting with c9766313417ed6c63c6491582618e11f2c623162bfe1ce8afb40748581d7cbb1 not found: ID does not exist" containerID="c9766313417ed6c63c6491582618e11f2c623162bfe1ce8afb40748581d7cbb1" Oct 07 13:24:48 crc kubenswrapper[4677]: I1007 13:24:48.891888 4677 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c9766313417ed6c63c6491582618e11f2c623162bfe1ce8afb40748581d7cbb1"} err="failed to get container status \"c9766313417ed6c63c6491582618e11f2c623162bfe1ce8afb40748581d7cbb1\": rpc error: code = NotFound desc = could not find container \"c9766313417ed6c63c6491582618e11f2c623162bfe1ce8afb40748581d7cbb1\": container with ID starting with c9766313417ed6c63c6491582618e11f2c623162bfe1ce8afb40748581d7cbb1 not found: ID does not exist" Oct 07 13:24:48 crc kubenswrapper[4677]: I1007 13:24:48.898349 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/071dfc6a-adea-40fb-9011-e4c04166b624-credential-keys\") pod \"071dfc6a-adea-40fb-9011-e4c04166b624\" (UID: \"071dfc6a-adea-40fb-9011-e4c04166b624\") " Oct 07 13:24:48 crc kubenswrapper[4677]: I1007 13:24:48.898407 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/071dfc6a-adea-40fb-9011-e4c04166b624-scripts\") pod \"071dfc6a-adea-40fb-9011-e4c04166b624\" (UID: \"071dfc6a-adea-40fb-9011-e4c04166b624\") " Oct 07 13:24:48 crc kubenswrapper[4677]: I1007 13:24:48.898481 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/071dfc6a-adea-40fb-9011-e4c04166b624-config-data\") pod \"071dfc6a-adea-40fb-9011-e4c04166b624\" (UID: \"071dfc6a-adea-40fb-9011-e4c04166b624\") " Oct 07 13:24:48 crc kubenswrapper[4677]: I1007 13:24:48.898543 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vnt4f\" (UniqueName: \"kubernetes.io/projected/071dfc6a-adea-40fb-9011-e4c04166b624-kube-api-access-vnt4f\") pod \"071dfc6a-adea-40fb-9011-e4c04166b624\" (UID: \"071dfc6a-adea-40fb-9011-e4c04166b624\") " Oct 07 13:24:48 crc kubenswrapper[4677]: I1007 13:24:48.898625 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/071dfc6a-adea-40fb-9011-e4c04166b624-fernet-keys\") pod \"071dfc6a-adea-40fb-9011-e4c04166b624\" (UID: \"071dfc6a-adea-40fb-9011-e4c04166b624\") " Oct 07 13:24:48 crc kubenswrapper[4677]: I1007 13:24:48.902814 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/071dfc6a-adea-40fb-9011-e4c04166b624-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "071dfc6a-adea-40fb-9011-e4c04166b624" (UID: 
"071dfc6a-adea-40fb-9011-e4c04166b624"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:24:48 crc kubenswrapper[4677]: I1007 13:24:48.902978 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/071dfc6a-adea-40fb-9011-e4c04166b624-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "071dfc6a-adea-40fb-9011-e4c04166b624" (UID: "071dfc6a-adea-40fb-9011-e4c04166b624"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:24:48 crc kubenswrapper[4677]: I1007 13:24:48.903017 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/071dfc6a-adea-40fb-9011-e4c04166b624-kube-api-access-vnt4f" (OuterVolumeSpecName: "kube-api-access-vnt4f") pod "071dfc6a-adea-40fb-9011-e4c04166b624" (UID: "071dfc6a-adea-40fb-9011-e4c04166b624"). InnerVolumeSpecName "kube-api-access-vnt4f". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:24:48 crc kubenswrapper[4677]: I1007 13:24:48.904319 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/071dfc6a-adea-40fb-9011-e4c04166b624-scripts" (OuterVolumeSpecName: "scripts") pod "071dfc6a-adea-40fb-9011-e4c04166b624" (UID: "071dfc6a-adea-40fb-9011-e4c04166b624"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:24:48 crc kubenswrapper[4677]: I1007 13:24:48.914366 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/071dfc6a-adea-40fb-9011-e4c04166b624-config-data" (OuterVolumeSpecName: "config-data") pod "071dfc6a-adea-40fb-9011-e4c04166b624" (UID: "071dfc6a-adea-40fb-9011-e4c04166b624"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:24:49 crc kubenswrapper[4677]: I1007 13:24:49.000064 4677 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/071dfc6a-adea-40fb-9011-e4c04166b624-credential-keys\") on node \"crc\" DevicePath \"\"" Oct 07 13:24:49 crc kubenswrapper[4677]: I1007 13:24:49.000101 4677 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/071dfc6a-adea-40fb-9011-e4c04166b624-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 13:24:49 crc kubenswrapper[4677]: I1007 13:24:49.000110 4677 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/071dfc6a-adea-40fb-9011-e4c04166b624-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 13:24:49 crc kubenswrapper[4677]: I1007 13:24:49.000118 4677 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vnt4f\" (UniqueName: \"kubernetes.io/projected/071dfc6a-adea-40fb-9011-e4c04166b624-kube-api-access-vnt4f\") on node \"crc\" DevicePath \"\"" Oct 07 13:24:49 crc kubenswrapper[4677]: I1007 13:24:49.000130 4677 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/071dfc6a-adea-40fb-9011-e4c04166b624-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 07 13:24:49 crc kubenswrapper[4677]: I1007 13:24:49.226517 4677 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-54f4dd5dd6-g8bmp"] Oct 07 13:24:49 crc kubenswrapper[4677]: I1007 13:24:49.235638 4677 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone-54f4dd5dd6-g8bmp"] Oct 07 13:24:49 crc kubenswrapper[4677]: I1007 13:24:49.316003 4677 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="071dfc6a-adea-40fb-9011-e4c04166b624" path="/var/lib/kubelet/pods/071dfc6a-adea-40fb-9011-e4c04166b624/volumes" Oct 07 13:24:49 crc kubenswrapper[4677]: I1007 13:24:49.316651 4677 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5d2b9453-afa4-49a2-9fa5-e2041579a7b3" path="/var/lib/kubelet/pods/5d2b9453-afa4-49a2-9fa5-e2041579a7b3/volumes" Oct 07 13:24:49 crc kubenswrapper[4677]: I1007 13:24:49.317255 4677 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="703f8326-3055-4bc7-bfbf-6d5c87582768" path="/var/lib/kubelet/pods/703f8326-3055-4bc7-bfbf-6d5c87582768/volumes" Oct 07 13:24:49 crc kubenswrapper[4677]: I1007 13:24:49.506482 4677 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-db-sync-d9cq4"] Oct 07 13:24:49 crc kubenswrapper[4677]: I1007 13:24:49.525725 4677 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone-db-sync-d9cq4"] Oct 07 13:24:49 crc kubenswrapper[4677]: I1007 13:24:49.534758 4677 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-bootstrap-5xk94"] Oct 07 13:24:49 crc kubenswrapper[4677]: I1007 13:24:49.541468 4677 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone-bootstrap-5xk94"] Oct 07 13:24:49 crc kubenswrapper[4677]: I1007 13:24:49.546938 4677 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone36ef-account-delete-rxff9"] Oct 07 13:24:49 crc kubenswrapper[4677]: E1007 13:24:49.547247 4677 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d2b9453-afa4-49a2-9fa5-e2041579a7b3" containerName="keystone-api" Oct 07 13:24:49 crc 
kubenswrapper[4677]: I1007 13:24:49.547262 4677 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d2b9453-afa4-49a2-9fa5-e2041579a7b3" containerName="keystone-api" Oct 07 13:24:49 crc kubenswrapper[4677]: E1007 13:24:49.547282 4677 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="071dfc6a-adea-40fb-9011-e4c04166b624" containerName="keystone-api" Oct 07 13:24:49 crc kubenswrapper[4677]: I1007 13:24:49.547291 4677 state_mem.go:107] "Deleted CPUSet assignment" podUID="071dfc6a-adea-40fb-9011-e4c04166b624" containerName="keystone-api" Oct 07 13:24:49 crc kubenswrapper[4677]: E1007 13:24:49.547300 4677 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="703f8326-3055-4bc7-bfbf-6d5c87582768" containerName="keystone-api" Oct 07 13:24:49 crc kubenswrapper[4677]: I1007 13:24:49.547306 4677 state_mem.go:107] "Deleted CPUSet assignment" podUID="703f8326-3055-4bc7-bfbf-6d5c87582768" containerName="keystone-api" Oct 07 13:24:49 crc kubenswrapper[4677]: I1007 13:24:49.547480 4677 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d2b9453-afa4-49a2-9fa5-e2041579a7b3" containerName="keystone-api" Oct 07 13:24:49 crc kubenswrapper[4677]: I1007 13:24:49.547493 4677 memory_manager.go:354] "RemoveStaleState removing state" podUID="703f8326-3055-4bc7-bfbf-6d5c87582768" containerName="keystone-api" Oct 07 13:24:49 crc kubenswrapper[4677]: I1007 13:24:49.547503 4677 memory_manager.go:354] "RemoveStaleState removing state" podUID="071dfc6a-adea-40fb-9011-e4c04166b624" containerName="keystone-api" Oct 07 13:24:49 crc kubenswrapper[4677]: I1007 13:24:49.547920 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone36ef-account-delete-rxff9" Oct 07 13:24:49 crc kubenswrapper[4677]: I1007 13:24:49.552330 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone36ef-account-delete-rxff9"] Oct 07 13:24:49 crc kubenswrapper[4677]: I1007 13:24:49.711138 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lksvj\" (UniqueName: \"kubernetes.io/projected/e5c00fa5-03f7-428d-bc23-8f6011e32be8-kube-api-access-lksvj\") pod \"keystone36ef-account-delete-rxff9\" (UID: \"e5c00fa5-03f7-428d-bc23-8f6011e32be8\") " pod="keystone-kuttl-tests/keystone36ef-account-delete-rxff9" Oct 07 13:24:49 crc kubenswrapper[4677]: I1007 13:24:49.812676 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lksvj\" (UniqueName: \"kubernetes.io/projected/e5c00fa5-03f7-428d-bc23-8f6011e32be8-kube-api-access-lksvj\") pod \"keystone36ef-account-delete-rxff9\" (UID: \"e5c00fa5-03f7-428d-bc23-8f6011e32be8\") " pod="keystone-kuttl-tests/keystone36ef-account-delete-rxff9" Oct 07 13:24:49 crc kubenswrapper[4677]: I1007 13:24:49.834035 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lksvj\" (UniqueName: \"kubernetes.io/projected/e5c00fa5-03f7-428d-bc23-8f6011e32be8-kube-api-access-lksvj\") pod \"keystone36ef-account-delete-rxff9\" (UID: \"e5c00fa5-03f7-428d-bc23-8f6011e32be8\") " pod="keystone-kuttl-tests/keystone36ef-account-delete-rxff9" Oct 07 13:24:49 crc kubenswrapper[4677]: I1007 13:24:49.865743 4677 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone36ef-account-delete-rxff9" Oct 07 13:24:50 crc kubenswrapper[4677]: I1007 13:24:50.315000 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone36ef-account-delete-rxff9"] Oct 07 13:24:50 crc kubenswrapper[4677]: I1007 13:24:50.893689 4677 generic.go:334] "Generic (PLEG): container finished" podID="e5c00fa5-03f7-428d-bc23-8f6011e32be8" containerID="65718cb46256c82b8f0772670966bf56a32bb58ee7f882ac94471e9c23a98984" exitCode=0 Oct 07 13:24:50 crc kubenswrapper[4677]: I1007 13:24:50.893773 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone36ef-account-delete-rxff9" event={"ID":"e5c00fa5-03f7-428d-bc23-8f6011e32be8","Type":"ContainerDied","Data":"65718cb46256c82b8f0772670966bf56a32bb58ee7f882ac94471e9c23a98984"} Oct 07 13:24:50 crc kubenswrapper[4677]: I1007 13:24:50.893961 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone36ef-account-delete-rxff9" event={"ID":"e5c00fa5-03f7-428d-bc23-8f6011e32be8","Type":"ContainerStarted","Data":"c923003b2c0444bba920d7a84e58b1c9728dde6e44bc4d947522860c56405e8f"} Oct 07 13:24:51 crc kubenswrapper[4677]: I1007 13:24:51.312608 4677 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f5ab40d-dd7b-4e7d-bfbb-b36818073c82" path="/var/lib/kubelet/pods/8f5ab40d-dd7b-4e7d-bfbb-b36818073c82/volumes" Oct 07 13:24:51 crc kubenswrapper[4677]: I1007 13:24:51.313531 4677 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd45ad31-2db5-4807-a5b5-8f1e5031dea3" path="/var/lib/kubelet/pods/cd45ad31-2db5-4807-a5b5-8f1e5031dea3/volumes" Oct 07 13:24:52 crc kubenswrapper[4677]: I1007 13:24:52.219904 4677 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone36ef-account-delete-rxff9" Oct 07 13:24:52 crc kubenswrapper[4677]: I1007 13:24:52.352237 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lksvj\" (UniqueName: \"kubernetes.io/projected/e5c00fa5-03f7-428d-bc23-8f6011e32be8-kube-api-access-lksvj\") pod \"e5c00fa5-03f7-428d-bc23-8f6011e32be8\" (UID: \"e5c00fa5-03f7-428d-bc23-8f6011e32be8\") " Oct 07 13:24:52 crc kubenswrapper[4677]: I1007 13:24:52.357237 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5c00fa5-03f7-428d-bc23-8f6011e32be8-kube-api-access-lksvj" (OuterVolumeSpecName: "kube-api-access-lksvj") pod "e5c00fa5-03f7-428d-bc23-8f6011e32be8" (UID: "e5c00fa5-03f7-428d-bc23-8f6011e32be8"). InnerVolumeSpecName "kube-api-access-lksvj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:24:52 crc kubenswrapper[4677]: I1007 13:24:52.454258 4677 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lksvj\" (UniqueName: \"kubernetes.io/projected/e5c00fa5-03f7-428d-bc23-8f6011e32be8-kube-api-access-lksvj\") on node \"crc\" DevicePath \"\"" Oct 07 13:24:52 crc kubenswrapper[4677]: I1007 13:24:52.912222 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone36ef-account-delete-rxff9" event={"ID":"e5c00fa5-03f7-428d-bc23-8f6011e32be8","Type":"ContainerDied","Data":"c923003b2c0444bba920d7a84e58b1c9728dde6e44bc4d947522860c56405e8f"} Oct 07 13:24:52 crc kubenswrapper[4677]: I1007 13:24:52.912507 4677 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c923003b2c0444bba920d7a84e58b1c9728dde6e44bc4d947522860c56405e8f" Oct 07 13:24:52 crc kubenswrapper[4677]: I1007 13:24:52.912334 4677 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone36ef-account-delete-rxff9" Oct 07 13:24:54 crc kubenswrapper[4677]: I1007 13:24:54.562638 4677 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-db-create-ln4h2"] Oct 07 13:24:54 crc kubenswrapper[4677]: I1007 13:24:54.575850 4677 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone-db-create-ln4h2"] Oct 07 13:24:54 crc kubenswrapper[4677]: I1007 13:24:54.585938 4677 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone36ef-account-delete-rxff9"] Oct 07 13:24:54 crc kubenswrapper[4677]: I1007 13:24:54.591775 4677 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-36ef-account-create-kdh4d"] Oct 07 13:24:54 crc kubenswrapper[4677]: I1007 13:24:54.596840 4677 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone-36ef-account-create-kdh4d"] Oct 07 13:24:54 crc kubenswrapper[4677]: I1007 13:24:54.601742 4677 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone36ef-account-delete-rxff9"] Oct 07 13:24:54 crc kubenswrapper[4677]: I1007 13:24:54.740239 4677 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone-db-create-qljk2"] Oct 07 13:24:54 crc kubenswrapper[4677]: E1007 13:24:54.740733 4677 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5c00fa5-03f7-428d-bc23-8f6011e32be8" containerName="mariadb-account-delete" Oct 07 13:24:54 crc kubenswrapper[4677]: I1007 13:24:54.740769 4677 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5c00fa5-03f7-428d-bc23-8f6011e32be8" containerName="mariadb-account-delete" Oct 07 13:24:54 crc kubenswrapper[4677]: I1007 13:24:54.741042 4677 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5c00fa5-03f7-428d-bc23-8f6011e32be8" containerName="mariadb-account-delete" Oct 07 13:24:54 crc kubenswrapper[4677]: I1007 13:24:54.741837 4677 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-db-create-qljk2" Oct 07 13:24:54 crc kubenswrapper[4677]: I1007 13:24:54.753836 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-db-create-qljk2"] Oct 07 13:24:54 crc kubenswrapper[4677]: I1007 13:24:54.890186 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7rwrs\" (UniqueName: \"kubernetes.io/projected/49ab45b0-1e6e-49af-9aff-3c9716562dc8-kube-api-access-7rwrs\") pod \"keystone-db-create-qljk2\" (UID: \"49ab45b0-1e6e-49af-9aff-3c9716562dc8\") " pod="keystone-kuttl-tests/keystone-db-create-qljk2" Oct 07 13:24:54 crc kubenswrapper[4677]: I1007 13:24:54.991890 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7rwrs\" (UniqueName: \"kubernetes.io/projected/49ab45b0-1e6e-49af-9aff-3c9716562dc8-kube-api-access-7rwrs\") pod \"keystone-db-create-qljk2\" (UID: \"49ab45b0-1e6e-49af-9aff-3c9716562dc8\") " pod="keystone-kuttl-tests/keystone-db-create-qljk2" Oct 07 13:24:55 crc kubenswrapper[4677]: I1007 13:24:55.015999 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7rwrs\" (UniqueName: \"kubernetes.io/projected/49ab45b0-1e6e-49af-9aff-3c9716562dc8-kube-api-access-7rwrs\") pod \"keystone-db-create-qljk2\" (UID: \"49ab45b0-1e6e-49af-9aff-3c9716562dc8\") " pod="keystone-kuttl-tests/keystone-db-create-qljk2" Oct 07 13:24:55 crc kubenswrapper[4677]: I1007 13:24:55.074879 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-db-create-qljk2" Oct 07 13:24:55 crc kubenswrapper[4677]: I1007 13:24:55.310621 4677 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="35ebf301-93d2-4af2-83c0-98b833bce405" path="/var/lib/kubelet/pods/35ebf301-93d2-4af2-83c0-98b833bce405/volumes" Oct 07 13:24:55 crc kubenswrapper[4677]: I1007 13:24:55.311329 4677 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bfb3a5e0-9226-49bc-b044-2996c9496e20" path="/var/lib/kubelet/pods/bfb3a5e0-9226-49bc-b044-2996c9496e20/volumes" Oct 07 13:24:55 crc kubenswrapper[4677]: I1007 13:24:55.311799 4677 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e5c00fa5-03f7-428d-bc23-8f6011e32be8" path="/var/lib/kubelet/pods/e5c00fa5-03f7-428d-bc23-8f6011e32be8/volumes" Oct 07 13:24:55 crc kubenswrapper[4677]: I1007 13:24:55.500728 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-db-create-qljk2"] Oct 07 13:24:55 crc kubenswrapper[4677]: W1007 13:24:55.506410 4677 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod49ab45b0_1e6e_49af_9aff_3c9716562dc8.slice/crio-a9f99e2c96268291a827ba334da846c8a9fd8607920300ece5f55bbabe547dbd WatchSource:0}: Error finding container a9f99e2c96268291a827ba334da846c8a9fd8607920300ece5f55bbabe547dbd: Status 404 returned error can't find the container with id a9f99e2c96268291a827ba334da846c8a9fd8607920300ece5f55bbabe547dbd Oct 07 13:24:55 crc kubenswrapper[4677]: I1007 13:24:55.937864 4677 generic.go:334] "Generic (PLEG): container finished" podID="49ab45b0-1e6e-49af-9aff-3c9716562dc8" containerID="3f929f62fdb7df361d82c388ebb48844dcac0c50fb2ad8b525a6f6e0fe940e49" exitCode=0 Oct 07 13:24:55 crc kubenswrapper[4677]: I1007 13:24:55.937963 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="keystone-kuttl-tests/keystone-db-create-qljk2" event={"ID":"49ab45b0-1e6e-49af-9aff-3c9716562dc8","Type":"ContainerDied","Data":"3f929f62fdb7df361d82c388ebb48844dcac0c50fb2ad8b525a6f6e0fe940e49"} Oct 07 13:24:55 crc kubenswrapper[4677]: I1007 13:24:55.938218 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-create-qljk2" event={"ID":"49ab45b0-1e6e-49af-9aff-3c9716562dc8","Type":"ContainerStarted","Data":"a9f99e2c96268291a827ba334da846c8a9fd8607920300ece5f55bbabe547dbd"} Oct 07 13:24:57 crc kubenswrapper[4677]: I1007 13:24:57.247686 4677 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-db-create-qljk2" Oct 07 13:24:57 crc kubenswrapper[4677]: I1007 13:24:57.427076 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7rwrs\" (UniqueName: \"kubernetes.io/projected/49ab45b0-1e6e-49af-9aff-3c9716562dc8-kube-api-access-7rwrs\") pod \"49ab45b0-1e6e-49af-9aff-3c9716562dc8\" (UID: \"49ab45b0-1e6e-49af-9aff-3c9716562dc8\") " Oct 07 13:24:57 crc kubenswrapper[4677]: I1007 13:24:57.434524 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ab45b0-1e6e-49af-9aff-3c9716562dc8-kube-api-access-7rwrs" (OuterVolumeSpecName: "kube-api-access-7rwrs") pod "49ab45b0-1e6e-49af-9aff-3c9716562dc8" (UID: "49ab45b0-1e6e-49af-9aff-3c9716562dc8"). InnerVolumeSpecName "kube-api-access-7rwrs". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:24:57 crc kubenswrapper[4677]: I1007 13:24:57.529394 4677 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7rwrs\" (UniqueName: \"kubernetes.io/projected/49ab45b0-1e6e-49af-9aff-3c9716562dc8-kube-api-access-7rwrs\") on node \"crc\" DevicePath \"\"" Oct 07 13:24:57 crc kubenswrapper[4677]: I1007 13:24:57.972739 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-create-qljk2" event={"ID":"49ab45b0-1e6e-49af-9aff-3c9716562dc8","Type":"ContainerDied","Data":"a9f99e2c96268291a827ba334da846c8a9fd8607920300ece5f55bbabe547dbd"} Oct 07 13:24:57 crc kubenswrapper[4677]: I1007 13:24:57.972797 4677 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a9f99e2c96268291a827ba334da846c8a9fd8607920300ece5f55bbabe547dbd" Oct 07 13:24:57 crc kubenswrapper[4677]: I1007 13:24:57.972870 4677 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-db-create-qljk2" Oct 07 13:25:04 crc kubenswrapper[4677]: I1007 13:25:04.770845 4677 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone-1589-account-create-7pddq"] Oct 07 13:25:04 crc kubenswrapper[4677]: E1007 13:25:04.772039 4677 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49ab45b0-1e6e-49af-9aff-3c9716562dc8" containerName="mariadb-database-create" Oct 07 13:25:04 crc kubenswrapper[4677]: I1007 13:25:04.772066 4677 state_mem.go:107] "Deleted CPUSet assignment" podUID="49ab45b0-1e6e-49af-9aff-3c9716562dc8" containerName="mariadb-database-create" Oct 07 13:25:04 crc kubenswrapper[4677]: I1007 13:25:04.772280 4677 memory_manager.go:354] "RemoveStaleState removing state" podUID="49ab45b0-1e6e-49af-9aff-3c9716562dc8" containerName="mariadb-database-create" Oct 07 13:25:04 crc kubenswrapper[4677]: I1007 13:25:04.773252 4677 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-1589-account-create-7pddq" Oct 07 13:25:04 crc kubenswrapper[4677]: I1007 13:25:04.775026 4677 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-db-secret" Oct 07 13:25:04 crc kubenswrapper[4677]: I1007 13:25:04.784150 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-1589-account-create-7pddq"] Oct 07 13:25:04 crc kubenswrapper[4677]: I1007 13:25:04.944949 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mhp7t\" (UniqueName: \"kubernetes.io/projected/1a7c7c71-b92c-4595-8b66-a041dc94c05f-kube-api-access-mhp7t\") pod \"keystone-1589-account-create-7pddq\" (UID: \"1a7c7c71-b92c-4595-8b66-a041dc94c05f\") " pod="keystone-kuttl-tests/keystone-1589-account-create-7pddq" Oct 07 13:25:05 crc kubenswrapper[4677]: I1007 13:25:05.046807 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mhp7t\" (UniqueName: \"kubernetes.io/projected/1a7c7c71-b92c-4595-8b66-a041dc94c05f-kube-api-access-mhp7t\") pod \"keystone-1589-account-create-7pddq\" (UID: \"1a7c7c71-b92c-4595-8b66-a041dc94c05f\") " pod="keystone-kuttl-tests/keystone-1589-account-create-7pddq" Oct 07 13:25:05 crc kubenswrapper[4677]: I1007 13:25:05.080112 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mhp7t\" (UniqueName: \"kubernetes.io/projected/1a7c7c71-b92c-4595-8b66-a041dc94c05f-kube-api-access-mhp7t\") pod \"keystone-1589-account-create-7pddq\" (UID: \"1a7c7c71-b92c-4595-8b66-a041dc94c05f\") " pod="keystone-kuttl-tests/keystone-1589-account-create-7pddq" Oct 07 13:25:05 crc kubenswrapper[4677]: I1007 13:25:05.100871 4677 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-1589-account-create-7pddq" Oct 07 13:25:05 crc kubenswrapper[4677]: I1007 13:25:05.369118 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-1589-account-create-7pddq"] Oct 07 13:25:05 crc kubenswrapper[4677]: W1007 13:25:05.377785 4677 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1a7c7c71_b92c_4595_8b66_a041dc94c05f.slice/crio-9b134b54a2b87774194a5e28170cbc131465e2960ab602cbb805e376ae78c299 WatchSource:0}: Error finding container 9b134b54a2b87774194a5e28170cbc131465e2960ab602cbb805e376ae78c299: Status 404 returned error can't find the container with id 9b134b54a2b87774194a5e28170cbc131465e2960ab602cbb805e376ae78c299 Oct 07 13:25:06 crc kubenswrapper[4677]: I1007 13:25:06.043033 4677 generic.go:334] "Generic (PLEG): container finished" podID="1a7c7c71-b92c-4595-8b66-a041dc94c05f" containerID="81a9e406509d8a147436efcbc4e859895d1103232c9f2a2d0503ae75c417f9eb" exitCode=0 Oct 07 13:25:06 crc kubenswrapper[4677]: I1007 13:25:06.043113 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-1589-account-create-7pddq" event={"ID":"1a7c7c71-b92c-4595-8b66-a041dc94c05f","Type":"ContainerDied","Data":"81a9e406509d8a147436efcbc4e859895d1103232c9f2a2d0503ae75c417f9eb"} Oct 07 13:25:06 crc kubenswrapper[4677]: I1007 13:25:06.043503 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-1589-account-create-7pddq" event={"ID":"1a7c7c71-b92c-4595-8b66-a041dc94c05f","Type":"ContainerStarted","Data":"9b134b54a2b87774194a5e28170cbc131465e2960ab602cbb805e376ae78c299"} Oct 07 13:25:07 crc kubenswrapper[4677]: I1007 13:25:07.360595 4677 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-1589-account-create-7pddq" Oct 07 13:25:07 crc kubenswrapper[4677]: I1007 13:25:07.483571 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mhp7t\" (UniqueName: \"kubernetes.io/projected/1a7c7c71-b92c-4595-8b66-a041dc94c05f-kube-api-access-mhp7t\") pod \"1a7c7c71-b92c-4595-8b66-a041dc94c05f\" (UID: \"1a7c7c71-b92c-4595-8b66-a041dc94c05f\") " Oct 07 13:25:07 crc kubenswrapper[4677]: I1007 13:25:07.491751 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1a7c7c71-b92c-4595-8b66-a041dc94c05f-kube-api-access-mhp7t" (OuterVolumeSpecName: "kube-api-access-mhp7t") pod "1a7c7c71-b92c-4595-8b66-a041dc94c05f" (UID: "1a7c7c71-b92c-4595-8b66-a041dc94c05f"). InnerVolumeSpecName "kube-api-access-mhp7t". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:25:07 crc kubenswrapper[4677]: I1007 13:25:07.586272 4677 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mhp7t\" (UniqueName: \"kubernetes.io/projected/1a7c7c71-b92c-4595-8b66-a041dc94c05f-kube-api-access-mhp7t\") on node \"crc\" DevicePath \"\"" Oct 07 13:25:08 crc kubenswrapper[4677]: I1007 13:25:08.063582 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-1589-account-create-7pddq" event={"ID":"1a7c7c71-b92c-4595-8b66-a041dc94c05f","Type":"ContainerDied","Data":"9b134b54a2b87774194a5e28170cbc131465e2960ab602cbb805e376ae78c299"} Oct 07 13:25:08 crc kubenswrapper[4677]: I1007 13:25:08.063643 4677 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9b134b54a2b87774194a5e28170cbc131465e2960ab602cbb805e376ae78c299" Oct 07 13:25:08 crc kubenswrapper[4677]: I1007 13:25:08.063599 4677 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-1589-account-create-7pddq" Oct 07 13:25:10 crc kubenswrapper[4677]: I1007 13:25:10.357280 4677 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone-db-sync-785mf"] Oct 07 13:25:10 crc kubenswrapper[4677]: E1007 13:25:10.358803 4677 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a7c7c71-b92c-4595-8b66-a041dc94c05f" containerName="mariadb-account-create" Oct 07 13:25:10 crc kubenswrapper[4677]: I1007 13:25:10.358835 4677 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a7c7c71-b92c-4595-8b66-a041dc94c05f" containerName="mariadb-account-create" Oct 07 13:25:10 crc kubenswrapper[4677]: I1007 13:25:10.358962 4677 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a7c7c71-b92c-4595-8b66-a041dc94c05f" containerName="mariadb-account-create" Oct 07 13:25:10 crc kubenswrapper[4677]: I1007 13:25:10.359540 4677 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-db-sync-785mf" Oct 07 13:25:10 crc kubenswrapper[4677]: I1007 13:25:10.365390 4677 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-scripts" Oct 07 13:25:10 crc kubenswrapper[4677]: I1007 13:25:10.365422 4677 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone" Oct 07 13:25:10 crc kubenswrapper[4677]: I1007 13:25:10.365614 4677 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-config-data" Oct 07 13:25:10 crc kubenswrapper[4677]: I1007 13:25:10.366158 4677 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"combined-ca-bundle" Oct 07 13:25:10 crc kubenswrapper[4677]: I1007 13:25:10.367984 4677 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-keystone-dockercfg-vcx7s" Oct 07 13:25:10 crc kubenswrapper[4677]: I1007 13:25:10.373728 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-db-sync-785mf"] Oct 07 13:25:10 crc kubenswrapper[4677]: I1007 13:25:10.530793 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xfsmx\" (UniqueName: \"kubernetes.io/projected/3ca10262-0826-4aa9-ac24-b09d2d294fde-kube-api-access-xfsmx\") pod \"keystone-db-sync-785mf\" (UID: \"3ca10262-0826-4aa9-ac24-b09d2d294fde\") " pod="keystone-kuttl-tests/keystone-db-sync-785mf" Oct 07 13:25:10 crc kubenswrapper[4677]: I1007 13:25:10.531053 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ca10262-0826-4aa9-ac24-b09d2d294fde-config-data\") pod \"keystone-db-sync-785mf\" (UID: \"3ca10262-0826-4aa9-ac24-b09d2d294fde\") " pod="keystone-kuttl-tests/keystone-db-sync-785mf" Oct 07 13:25:10 crc kubenswrapper[4677]: I1007 13:25:10.531149 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ca10262-0826-4aa9-ac24-b09d2d294fde-combined-ca-bundle\") pod \"keystone-db-sync-785mf\" (UID: \"3ca10262-0826-4aa9-ac24-b09d2d294fde\") " pod="keystone-kuttl-tests/keystone-db-sync-785mf" Oct 07 13:25:10 crc kubenswrapper[4677]: I1007 13:25:10.631946 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ca10262-0826-4aa9-ac24-b09d2d294fde-config-data\") pod \"keystone-db-sync-785mf\" (UID: \"3ca10262-0826-4aa9-ac24-b09d2d294fde\") " pod="keystone-kuttl-tests/keystone-db-sync-785mf" Oct 07 13:25:10 crc kubenswrapper[4677]: I1007 13:25:10.631987 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ca10262-0826-4aa9-ac24-b09d2d294fde-combined-ca-bundle\") pod \"keystone-db-sync-785mf\" (UID: \"3ca10262-0826-4aa9-ac24-b09d2d294fde\") " pod="keystone-kuttl-tests/keystone-db-sync-785mf" Oct 07 13:25:10 crc kubenswrapper[4677]: I1007 13:25:10.632037 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xfsmx\" (UniqueName: \"kubernetes.io/projected/3ca10262-0826-4aa9-ac24-b09d2d294fde-kube-api-access-xfsmx\") pod \"keystone-db-sync-785mf\" (UID: \"3ca10262-0826-4aa9-ac24-b09d2d294fde\") " pod="keystone-kuttl-tests/keystone-db-sync-785mf" Oct 07 13:25:10 crc 
kubenswrapper[4677]: I1007 13:25:10.638726 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ca10262-0826-4aa9-ac24-b09d2d294fde-config-data\") pod \"keystone-db-sync-785mf\" (UID: \"3ca10262-0826-4aa9-ac24-b09d2d294fde\") " pod="keystone-kuttl-tests/keystone-db-sync-785mf" Oct 07 13:25:10 crc kubenswrapper[4677]: I1007 13:25:10.645165 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ca10262-0826-4aa9-ac24-b09d2d294fde-combined-ca-bundle\") pod \"keystone-db-sync-785mf\" (UID: \"3ca10262-0826-4aa9-ac24-b09d2d294fde\") " pod="keystone-kuttl-tests/keystone-db-sync-785mf" Oct 07 13:25:10 crc kubenswrapper[4677]: I1007 13:25:10.664184 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xfsmx\" (UniqueName: \"kubernetes.io/projected/3ca10262-0826-4aa9-ac24-b09d2d294fde-kube-api-access-xfsmx\") pod \"keystone-db-sync-785mf\" (UID: \"3ca10262-0826-4aa9-ac24-b09d2d294fde\") " pod="keystone-kuttl-tests/keystone-db-sync-785mf" Oct 07 13:25:10 crc kubenswrapper[4677]: I1007 13:25:10.676088 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-db-sync-785mf" Oct 07 13:25:11 crc kubenswrapper[4677]: I1007 13:25:11.106014 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-db-sync-785mf"] Oct 07 13:25:11 crc kubenswrapper[4677]: W1007 13:25:11.117755 4677 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3ca10262_0826_4aa9_ac24_b09d2d294fde.slice/crio-9a3e3a5e3977c9f7cdebf5d559ab420411cf54906aab07ed6cac209ac6b274b5 WatchSource:0}: Error finding container 9a3e3a5e3977c9f7cdebf5d559ab420411cf54906aab07ed6cac209ac6b274b5: Status 404 returned error can't find the container with id 9a3e3a5e3977c9f7cdebf5d559ab420411cf54906aab07ed6cac209ac6b274b5 Oct 07 13:25:12 crc kubenswrapper[4677]: I1007 13:25:12.100872 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-sync-785mf" event={"ID":"3ca10262-0826-4aa9-ac24-b09d2d294fde","Type":"ContainerStarted","Data":"4d4cb2fd8469c6d6185c8a8996a9e39decdb470b52abb9be938bf38be38d6c6a"} Oct 07 13:25:12 crc kubenswrapper[4677]: I1007 13:25:12.101224 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-sync-785mf" event={"ID":"3ca10262-0826-4aa9-ac24-b09d2d294fde","Type":"ContainerStarted","Data":"9a3e3a5e3977c9f7cdebf5d559ab420411cf54906aab07ed6cac209ac6b274b5"} Oct 07 13:25:12 crc kubenswrapper[4677]: I1007 13:25:12.133206 4677 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keystone-kuttl-tests/keystone-db-sync-785mf" podStartSLOduration=2.1331811099999998 podStartE2EDuration="2.13318111s" podCreationTimestamp="2025-10-07 13:25:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:25:12.127142446 +0000 UTC m=+1083.612851621" watchObservedRunningTime="2025-10-07 13:25:12.13318111 +0000 UTC m=+1083.618890265" Oct 07 13:25:13 crc kubenswrapper[4677]: I1007 13:25:13.112011 4677 generic.go:334] "Generic (PLEG): container finished" podID="3ca10262-0826-4aa9-ac24-b09d2d294fde" containerID="4d4cb2fd8469c6d6185c8a8996a9e39decdb470b52abb9be938bf38be38d6c6a" exitCode=0 Oct 07 13:25:13 crc kubenswrapper[4677]: I1007 
13:25:13.112126 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-sync-785mf" event={"ID":"3ca10262-0826-4aa9-ac24-b09d2d294fde","Type":"ContainerDied","Data":"4d4cb2fd8469c6d6185c8a8996a9e39decdb470b52abb9be938bf38be38d6c6a"} Oct 07 13:25:14 crc kubenswrapper[4677]: I1007 13:25:14.449711 4677 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-db-sync-785mf" Oct 07 13:25:14 crc kubenswrapper[4677]: I1007 13:25:14.591126 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ca10262-0826-4aa9-ac24-b09d2d294fde-combined-ca-bundle\") pod \"3ca10262-0826-4aa9-ac24-b09d2d294fde\" (UID: \"3ca10262-0826-4aa9-ac24-b09d2d294fde\") " Oct 07 13:25:14 crc kubenswrapper[4677]: I1007 13:25:14.591230 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xfsmx\" (UniqueName: \"kubernetes.io/projected/3ca10262-0826-4aa9-ac24-b09d2d294fde-kube-api-access-xfsmx\") pod \"3ca10262-0826-4aa9-ac24-b09d2d294fde\" (UID: \"3ca10262-0826-4aa9-ac24-b09d2d294fde\") " Oct 07 13:25:14 crc kubenswrapper[4677]: I1007 13:25:14.591286 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ca10262-0826-4aa9-ac24-b09d2d294fde-config-data\") pod \"3ca10262-0826-4aa9-ac24-b09d2d294fde\" (UID: \"3ca10262-0826-4aa9-ac24-b09d2d294fde\") " Oct 07 13:25:14 crc kubenswrapper[4677]: I1007 13:25:14.600389 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ca10262-0826-4aa9-ac24-b09d2d294fde-kube-api-access-xfsmx" (OuterVolumeSpecName: "kube-api-access-xfsmx") pod "3ca10262-0826-4aa9-ac24-b09d2d294fde" (UID: "3ca10262-0826-4aa9-ac24-b09d2d294fde"). InnerVolumeSpecName "kube-api-access-xfsmx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:25:14 crc kubenswrapper[4677]: I1007 13:25:14.616372 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ca10262-0826-4aa9-ac24-b09d2d294fde-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3ca10262-0826-4aa9-ac24-b09d2d294fde" (UID: "3ca10262-0826-4aa9-ac24-b09d2d294fde"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:25:14 crc kubenswrapper[4677]: I1007 13:25:14.652001 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ca10262-0826-4aa9-ac24-b09d2d294fde-config-data" (OuterVolumeSpecName: "config-data") pod "3ca10262-0826-4aa9-ac24-b09d2d294fde" (UID: "3ca10262-0826-4aa9-ac24-b09d2d294fde"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:25:14 crc kubenswrapper[4677]: I1007 13:25:14.693577 4677 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ca10262-0826-4aa9-ac24-b09d2d294fde-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 13:25:14 crc kubenswrapper[4677]: I1007 13:25:14.693620 4677 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xfsmx\" (UniqueName: \"kubernetes.io/projected/3ca10262-0826-4aa9-ac24-b09d2d294fde-kube-api-access-xfsmx\") on node \"crc\" DevicePath \"\"" Oct 07 13:25:14 crc kubenswrapper[4677]: I1007 13:25:14.693636 4677 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ca10262-0826-4aa9-ac24-b09d2d294fde-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 13:25:15 crc kubenswrapper[4677]: I1007 13:25:15.132527 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-sync-785mf" event={"ID":"3ca10262-0826-4aa9-ac24-b09d2d294fde","Type":"ContainerDied","Data":"9a3e3a5e3977c9f7cdebf5d559ab420411cf54906aab07ed6cac209ac6b274b5"} Oct 07 13:25:15 crc kubenswrapper[4677]: I1007 13:25:15.132956 4677 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9a3e3a5e3977c9f7cdebf5d559ab420411cf54906aab07ed6cac209ac6b274b5" Oct 07 13:25:15 crc kubenswrapper[4677]: I1007 13:25:15.132563 4677 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-db-sync-785mf" Oct 07 13:25:15 crc kubenswrapper[4677]: I1007 13:25:15.337205 4677 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone-bootstrap-lk9hg"] Oct 07 13:25:15 crc kubenswrapper[4677]: E1007 13:25:15.337564 4677 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ca10262-0826-4aa9-ac24-b09d2d294fde" containerName="keystone-db-sync" Oct 07 13:25:15 crc kubenswrapper[4677]: I1007 13:25:15.337587 4677 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ca10262-0826-4aa9-ac24-b09d2d294fde" containerName="keystone-db-sync" Oct 07 13:25:15 crc kubenswrapper[4677]: I1007 13:25:15.337703 4677 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ca10262-0826-4aa9-ac24-b09d2d294fde" containerName="keystone-db-sync" Oct 07 13:25:15 crc kubenswrapper[4677]: I1007 13:25:15.338197 4677 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-bootstrap-lk9hg" Oct 07 13:25:15 crc kubenswrapper[4677]: I1007 13:25:15.341550 4677 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"combined-ca-bundle" Oct 07 13:25:15 crc kubenswrapper[4677]: I1007 13:25:15.341653 4677 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-scripts" Oct 07 13:25:15 crc kubenswrapper[4677]: I1007 13:25:15.341711 4677 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-config-data" Oct 07 13:25:15 crc kubenswrapper[4677]: I1007 13:25:15.344490 4677 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone" Oct 07 13:25:15 crc kubenswrapper[4677]: I1007 13:25:15.344494 4677 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-keystone-dockercfg-vcx7s" Oct 07 13:25:15 crc kubenswrapper[4677]: I1007 13:25:15.349142 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-bootstrap-lk9hg"] Oct 07 13:25:15 crc kubenswrapper[4677]: I1007 13:25:15.506395 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/da4cad07-541a-45b6-a8a9-f5c123912314-scripts\") pod \"keystone-bootstrap-lk9hg\" (UID: \"da4cad07-541a-45b6-a8a9-f5c123912314\") " pod="keystone-kuttl-tests/keystone-bootstrap-lk9hg" Oct 07 13:25:15 crc kubenswrapper[4677]: I1007 13:25:15.506506 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da4cad07-541a-45b6-a8a9-f5c123912314-combined-ca-bundle\") pod \"keystone-bootstrap-lk9hg\" (UID: \"da4cad07-541a-45b6-a8a9-f5c123912314\") " pod="keystone-kuttl-tests/keystone-bootstrap-lk9hg" Oct 07 13:25:15 crc kubenswrapper[4677]: I1007 13:25:15.506621 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da4cad07-541a-45b6-a8a9-f5c123912314-config-data\") pod \"keystone-bootstrap-lk9hg\" (UID: \"da4cad07-541a-45b6-a8a9-f5c123912314\") " pod="keystone-kuttl-tests/keystone-bootstrap-lk9hg" Oct 07 13:25:15 crc kubenswrapper[4677]: I1007 13:25:15.506657 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7drlx\" (UniqueName: \"kubernetes.io/projected/da4cad07-541a-45b6-a8a9-f5c123912314-kube-api-access-7drlx\") pod \"keystone-bootstrap-lk9hg\" (UID: \"da4cad07-541a-45b6-a8a9-f5c123912314\") " pod="keystone-kuttl-tests/keystone-bootstrap-lk9hg" Oct 07 13:25:15 crc kubenswrapper[4677]: I1007 13:25:15.506699 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/da4cad07-541a-45b6-a8a9-f5c123912314-credential-keys\") pod \"keystone-bootstrap-lk9hg\" (UID: \"da4cad07-541a-45b6-a8a9-f5c123912314\") " pod="keystone-kuttl-tests/keystone-bootstrap-lk9hg" Oct 07 13:25:15 crc kubenswrapper[4677]: I1007 13:25:15.506754 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/da4cad07-541a-45b6-a8a9-f5c123912314-fernet-keys\") pod \"keystone-bootstrap-lk9hg\" (UID: \"da4cad07-541a-45b6-a8a9-f5c123912314\") " 
pod="keystone-kuttl-tests/keystone-bootstrap-lk9hg" Oct 07 13:25:15 crc kubenswrapper[4677]: I1007 13:25:15.607750 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/da4cad07-541a-45b6-a8a9-f5c123912314-scripts\") pod \"keystone-bootstrap-lk9hg\" (UID: \"da4cad07-541a-45b6-a8a9-f5c123912314\") " pod="keystone-kuttl-tests/keystone-bootstrap-lk9hg" Oct 07 13:25:15 crc kubenswrapper[4677]: I1007 13:25:15.607861 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da4cad07-541a-45b6-a8a9-f5c123912314-combined-ca-bundle\") pod \"keystone-bootstrap-lk9hg\" (UID: \"da4cad07-541a-45b6-a8a9-f5c123912314\") " pod="keystone-kuttl-tests/keystone-bootstrap-lk9hg" Oct 07 13:25:15 crc kubenswrapper[4677]: I1007 13:25:15.607995 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da4cad07-541a-45b6-a8a9-f5c123912314-config-data\") pod \"keystone-bootstrap-lk9hg\" (UID: \"da4cad07-541a-45b6-a8a9-f5c123912314\") " pod="keystone-kuttl-tests/keystone-bootstrap-lk9hg" Oct 07 13:25:15 crc kubenswrapper[4677]: I1007 13:25:15.608030 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7drlx\" (UniqueName: \"kubernetes.io/projected/da4cad07-541a-45b6-a8a9-f5c123912314-kube-api-access-7drlx\") pod \"keystone-bootstrap-lk9hg\" (UID: \"da4cad07-541a-45b6-a8a9-f5c123912314\") " pod="keystone-kuttl-tests/keystone-bootstrap-lk9hg" Oct 07 13:25:15 crc kubenswrapper[4677]: I1007 13:25:15.608079 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/da4cad07-541a-45b6-a8a9-f5c123912314-credential-keys\") pod \"keystone-bootstrap-lk9hg\" (UID: \"da4cad07-541a-45b6-a8a9-f5c123912314\") " pod="keystone-kuttl-tests/keystone-bootstrap-lk9hg" Oct 07 13:25:15 crc kubenswrapper[4677]: I1007 13:25:15.608144 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/da4cad07-541a-45b6-a8a9-f5c123912314-fernet-keys\") pod \"keystone-bootstrap-lk9hg\" (UID: \"da4cad07-541a-45b6-a8a9-f5c123912314\") " pod="keystone-kuttl-tests/keystone-bootstrap-lk9hg" Oct 07 13:25:15 crc kubenswrapper[4677]: I1007 13:25:15.614399 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da4cad07-541a-45b6-a8a9-f5c123912314-combined-ca-bundle\") pod \"keystone-bootstrap-lk9hg\" (UID: \"da4cad07-541a-45b6-a8a9-f5c123912314\") " pod="keystone-kuttl-tests/keystone-bootstrap-lk9hg" Oct 07 13:25:15 crc kubenswrapper[4677]: I1007 13:25:15.614918 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da4cad07-541a-45b6-a8a9-f5c123912314-config-data\") pod \"keystone-bootstrap-lk9hg\" (UID: \"da4cad07-541a-45b6-a8a9-f5c123912314\") " pod="keystone-kuttl-tests/keystone-bootstrap-lk9hg" Oct 07 13:25:15 crc kubenswrapper[4677]: I1007 13:25:15.615534 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/da4cad07-541a-45b6-a8a9-f5c123912314-fernet-keys\") pod \"keystone-bootstrap-lk9hg\" (UID: \"da4cad07-541a-45b6-a8a9-f5c123912314\") " pod="keystone-kuttl-tests/keystone-bootstrap-lk9hg" Oct 07 13:25:15 crc 
kubenswrapper[4677]: I1007 13:25:15.615990 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/da4cad07-541a-45b6-a8a9-f5c123912314-credential-keys\") pod \"keystone-bootstrap-lk9hg\" (UID: \"da4cad07-541a-45b6-a8a9-f5c123912314\") " pod="keystone-kuttl-tests/keystone-bootstrap-lk9hg" Oct 07 13:25:15 crc kubenswrapper[4677]: I1007 13:25:15.618613 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/da4cad07-541a-45b6-a8a9-f5c123912314-scripts\") pod \"keystone-bootstrap-lk9hg\" (UID: \"da4cad07-541a-45b6-a8a9-f5c123912314\") " pod="keystone-kuttl-tests/keystone-bootstrap-lk9hg" Oct 07 13:25:15 crc kubenswrapper[4677]: I1007 13:25:15.630505 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7drlx\" (UniqueName: \"kubernetes.io/projected/da4cad07-541a-45b6-a8a9-f5c123912314-kube-api-access-7drlx\") pod \"keystone-bootstrap-lk9hg\" (UID: \"da4cad07-541a-45b6-a8a9-f5c123912314\") " pod="keystone-kuttl-tests/keystone-bootstrap-lk9hg" Oct 07 13:25:15 crc kubenswrapper[4677]: I1007 13:25:15.655490 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-bootstrap-lk9hg" Oct 07 13:25:15 crc kubenswrapper[4677]: I1007 13:25:15.920213 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-bootstrap-lk9hg"] Oct 07 13:25:16 crc kubenswrapper[4677]: I1007 13:25:16.142713 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-bootstrap-lk9hg" event={"ID":"da4cad07-541a-45b6-a8a9-f5c123912314","Type":"ContainerStarted","Data":"93f40180a44d6d1d31ea20f392e51cd8cec89e02b0e0d6a08a0e11af7f1f30e8"} Oct 07 13:25:16 crc kubenswrapper[4677]: I1007 13:25:16.143044 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-bootstrap-lk9hg" event={"ID":"da4cad07-541a-45b6-a8a9-f5c123912314","Type":"ContainerStarted","Data":"4248ed127b338fbcadbcc8a0fb603d5298075fa6863fa16b24f6e4633ac4a6f6"} Oct 07 13:25:19 crc kubenswrapper[4677]: I1007 13:25:19.170863 4677 generic.go:334] "Generic (PLEG): container finished" podID="da4cad07-541a-45b6-a8a9-f5c123912314" containerID="93f40180a44d6d1d31ea20f392e51cd8cec89e02b0e0d6a08a0e11af7f1f30e8" exitCode=0 Oct 07 13:25:19 crc kubenswrapper[4677]: I1007 13:25:19.170952 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-bootstrap-lk9hg" event={"ID":"da4cad07-541a-45b6-a8a9-f5c123912314","Type":"ContainerDied","Data":"93f40180a44d6d1d31ea20f392e51cd8cec89e02b0e0d6a08a0e11af7f1f30e8"} Oct 07 13:25:20 crc kubenswrapper[4677]: I1007 13:25:20.515228 4677 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-bootstrap-lk9hg" Oct 07 13:25:20 crc kubenswrapper[4677]: I1007 13:25:20.684205 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/da4cad07-541a-45b6-a8a9-f5c123912314-credential-keys\") pod \"da4cad07-541a-45b6-a8a9-f5c123912314\" (UID: \"da4cad07-541a-45b6-a8a9-f5c123912314\") " Oct 07 13:25:20 crc kubenswrapper[4677]: I1007 13:25:20.684258 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7drlx\" (UniqueName: \"kubernetes.io/projected/da4cad07-541a-45b6-a8a9-f5c123912314-kube-api-access-7drlx\") pod \"da4cad07-541a-45b6-a8a9-f5c123912314\" (UID: \"da4cad07-541a-45b6-a8a9-f5c123912314\") " Oct 07 13:25:20 crc kubenswrapper[4677]: I1007 13:25:20.684320 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/da4cad07-541a-45b6-a8a9-f5c123912314-fernet-keys\") pod \"da4cad07-541a-45b6-a8a9-f5c123912314\" (UID: \"da4cad07-541a-45b6-a8a9-f5c123912314\") " Oct 07 13:25:20 crc kubenswrapper[4677]: I1007 13:25:20.684380 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/da4cad07-541a-45b6-a8a9-f5c123912314-scripts\") pod \"da4cad07-541a-45b6-a8a9-f5c123912314\" (UID: \"da4cad07-541a-45b6-a8a9-f5c123912314\") " Oct 07 13:25:20 crc kubenswrapper[4677]: I1007 13:25:20.684420 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da4cad07-541a-45b6-a8a9-f5c123912314-config-data\") pod \"da4cad07-541a-45b6-a8a9-f5c123912314\" (UID: \"da4cad07-541a-45b6-a8a9-f5c123912314\") " Oct 07 13:25:20 crc kubenswrapper[4677]: I1007 13:25:20.684466 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da4cad07-541a-45b6-a8a9-f5c123912314-combined-ca-bundle\") pod \"da4cad07-541a-45b6-a8a9-f5c123912314\" (UID: \"da4cad07-541a-45b6-a8a9-f5c123912314\") " Oct 07 13:25:20 crc kubenswrapper[4677]: I1007 13:25:20.689596 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da4cad07-541a-45b6-a8a9-f5c123912314-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "da4cad07-541a-45b6-a8a9-f5c123912314" (UID: "da4cad07-541a-45b6-a8a9-f5c123912314"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:25:20 crc kubenswrapper[4677]: I1007 13:25:20.689849 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da4cad07-541a-45b6-a8a9-f5c123912314-kube-api-access-7drlx" (OuterVolumeSpecName: "kube-api-access-7drlx") pod "da4cad07-541a-45b6-a8a9-f5c123912314" (UID: "da4cad07-541a-45b6-a8a9-f5c123912314"). InnerVolumeSpecName "kube-api-access-7drlx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:25:20 crc kubenswrapper[4677]: I1007 13:25:20.690556 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da4cad07-541a-45b6-a8a9-f5c123912314-scripts" (OuterVolumeSpecName: "scripts") pod "da4cad07-541a-45b6-a8a9-f5c123912314" (UID: "da4cad07-541a-45b6-a8a9-f5c123912314"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:25:20 crc kubenswrapper[4677]: I1007 13:25:20.697477 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da4cad07-541a-45b6-a8a9-f5c123912314-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "da4cad07-541a-45b6-a8a9-f5c123912314" (UID: "da4cad07-541a-45b6-a8a9-f5c123912314"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:25:20 crc kubenswrapper[4677]: I1007 13:25:20.701794 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da4cad07-541a-45b6-a8a9-f5c123912314-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "da4cad07-541a-45b6-a8a9-f5c123912314" (UID: "da4cad07-541a-45b6-a8a9-f5c123912314"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:25:20 crc kubenswrapper[4677]: I1007 13:25:20.702105 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da4cad07-541a-45b6-a8a9-f5c123912314-config-data" (OuterVolumeSpecName: "config-data") pod "da4cad07-541a-45b6-a8a9-f5c123912314" (UID: "da4cad07-541a-45b6-a8a9-f5c123912314"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:25:20 crc kubenswrapper[4677]: I1007 13:25:20.785928 4677 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da4cad07-541a-45b6-a8a9-f5c123912314-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 13:25:20 crc kubenswrapper[4677]: I1007 13:25:20.785981 4677 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da4cad07-541a-45b6-a8a9-f5c123912314-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 13:25:20 crc kubenswrapper[4677]: I1007 13:25:20.786003 4677 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/da4cad07-541a-45b6-a8a9-f5c123912314-credential-keys\") on node \"crc\" DevicePath \"\"" Oct 07 13:25:20 crc kubenswrapper[4677]: I1007 13:25:20.786021 4677 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7drlx\" (UniqueName: \"kubernetes.io/projected/da4cad07-541a-45b6-a8a9-f5c123912314-kube-api-access-7drlx\") on node \"crc\" DevicePath \"\"" Oct 07 13:25:20 crc kubenswrapper[4677]: I1007 13:25:20.786039 4677 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/da4cad07-541a-45b6-a8a9-f5c123912314-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 07 13:25:20 crc kubenswrapper[4677]: I1007 13:25:20.786056 4677 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/da4cad07-541a-45b6-a8a9-f5c123912314-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 13:25:21 crc kubenswrapper[4677]: I1007 13:25:21.189359 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-bootstrap-lk9hg" event={"ID":"da4cad07-541a-45b6-a8a9-f5c123912314","Type":"ContainerDied","Data":"4248ed127b338fbcadbcc8a0fb603d5298075fa6863fa16b24f6e4633ac4a6f6"} Oct 07 13:25:21 crc kubenswrapper[4677]: I1007 13:25:21.189736 4677 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4248ed127b338fbcadbcc8a0fb603d5298075fa6863fa16b24f6e4633ac4a6f6" Oct 07 13:25:21 crc kubenswrapper[4677]: I1007 13:25:21.189424 4677 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-bootstrap-lk9hg" Oct 07 13:25:21 crc kubenswrapper[4677]: I1007 13:25:21.299713 4677 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone-6ddf586d4-ctx5x"] Oct 07 13:25:21 crc kubenswrapper[4677]: E1007 13:25:21.300028 4677 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da4cad07-541a-45b6-a8a9-f5c123912314" containerName="keystone-bootstrap" Oct 07 13:25:21 crc kubenswrapper[4677]: I1007 13:25:21.300045 4677 state_mem.go:107] "Deleted CPUSet assignment" podUID="da4cad07-541a-45b6-a8a9-f5c123912314" containerName="keystone-bootstrap" Oct 07 13:25:21 crc kubenswrapper[4677]: I1007 13:25:21.300151 4677 memory_manager.go:354] "RemoveStaleState removing state" podUID="da4cad07-541a-45b6-a8a9-f5c123912314" containerName="keystone-bootstrap" Oct 07 13:25:21 crc kubenswrapper[4677]: I1007 13:25:21.300694 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-6ddf586d4-ctx5x" Oct 07 13:25:21 crc kubenswrapper[4677]: I1007 13:25:21.302802 4677 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"combined-ca-bundle" Oct 07 13:25:21 crc kubenswrapper[4677]: I1007 13:25:21.302863 4677 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"cert-keystone-public-svc" Oct 07 13:25:21 crc kubenswrapper[4677]: I1007 13:25:21.303044 4677 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-scripts" Oct 07 13:25:21 crc kubenswrapper[4677]: I1007 13:25:21.303451 4677 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-config-data" Oct 07 13:25:21 crc kubenswrapper[4677]: I1007 13:25:21.303509 4677 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-keystone-dockercfg-vcx7s" Oct 07 13:25:21 crc kubenswrapper[4677]: I1007 13:25:21.303579 4677 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone" Oct 07 13:25:21 crc kubenswrapper[4677]: I1007 13:25:21.305018 4677 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"cert-keystone-internal-svc" Oct 07 13:25:21 crc kubenswrapper[4677]: I1007 13:25:21.313854 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-6ddf586d4-ctx5x"] Oct 07 13:25:21 crc kubenswrapper[4677]: I1007 13:25:21.393937 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0c702d9a-1cc2-40fa-bc18-bad260076f52-credential-keys\") pod \"keystone-6ddf586d4-ctx5x\" (UID: \"0c702d9a-1cc2-40fa-bc18-bad260076f52\") " pod="keystone-kuttl-tests/keystone-6ddf586d4-ctx5x" Oct 07 13:25:21 crc kubenswrapper[4677]: I1007 13:25:21.394039 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0c702d9a-1cc2-40fa-bc18-bad260076f52-internal-tls-certs\") pod \"keystone-6ddf586d4-ctx5x\" (UID: \"0c702d9a-1cc2-40fa-bc18-bad260076f52\") " pod="keystone-kuttl-tests/keystone-6ddf586d4-ctx5x" Oct 07 13:25:21 crc kubenswrapper[4677]: I1007 13:25:21.394161 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: 
\"kubernetes.io/secret/0c702d9a-1cc2-40fa-bc18-bad260076f52-fernet-keys\") pod \"keystone-6ddf586d4-ctx5x\" (UID: \"0c702d9a-1cc2-40fa-bc18-bad260076f52\") " pod="keystone-kuttl-tests/keystone-6ddf586d4-ctx5x" Oct 07 13:25:21 crc kubenswrapper[4677]: I1007 13:25:21.394214 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c702d9a-1cc2-40fa-bc18-bad260076f52-combined-ca-bundle\") pod \"keystone-6ddf586d4-ctx5x\" (UID: \"0c702d9a-1cc2-40fa-bc18-bad260076f52\") " pod="keystone-kuttl-tests/keystone-6ddf586d4-ctx5x" Oct 07 13:25:21 crc kubenswrapper[4677]: I1007 13:25:21.394254 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0c702d9a-1cc2-40fa-bc18-bad260076f52-public-tls-certs\") pod \"keystone-6ddf586d4-ctx5x\" (UID: \"0c702d9a-1cc2-40fa-bc18-bad260076f52\") " pod="keystone-kuttl-tests/keystone-6ddf586d4-ctx5x" Oct 07 13:25:21 crc kubenswrapper[4677]: I1007 13:25:21.394270 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b5csn\" (UniqueName: \"kubernetes.io/projected/0c702d9a-1cc2-40fa-bc18-bad260076f52-kube-api-access-b5csn\") pod \"keystone-6ddf586d4-ctx5x\" (UID: \"0c702d9a-1cc2-40fa-bc18-bad260076f52\") " pod="keystone-kuttl-tests/keystone-6ddf586d4-ctx5x" Oct 07 13:25:21 crc kubenswrapper[4677]: I1007 13:25:21.394326 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c702d9a-1cc2-40fa-bc18-bad260076f52-config-data\") pod \"keystone-6ddf586d4-ctx5x\" (UID: \"0c702d9a-1cc2-40fa-bc18-bad260076f52\") " pod="keystone-kuttl-tests/keystone-6ddf586d4-ctx5x" Oct 07 13:25:21 crc kubenswrapper[4677]: I1007 13:25:21.394390 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0c702d9a-1cc2-40fa-bc18-bad260076f52-scripts\") pod \"keystone-6ddf586d4-ctx5x\" (UID: \"0c702d9a-1cc2-40fa-bc18-bad260076f52\") " pod="keystone-kuttl-tests/keystone-6ddf586d4-ctx5x" Oct 07 13:25:21 crc kubenswrapper[4677]: I1007 13:25:21.496191 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0c702d9a-1cc2-40fa-bc18-bad260076f52-credential-keys\") pod \"keystone-6ddf586d4-ctx5x\" (UID: \"0c702d9a-1cc2-40fa-bc18-bad260076f52\") " pod="keystone-kuttl-tests/keystone-6ddf586d4-ctx5x" Oct 07 13:25:21 crc kubenswrapper[4677]: I1007 13:25:21.496246 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0c702d9a-1cc2-40fa-bc18-bad260076f52-internal-tls-certs\") pod \"keystone-6ddf586d4-ctx5x\" (UID: \"0c702d9a-1cc2-40fa-bc18-bad260076f52\") " pod="keystone-kuttl-tests/keystone-6ddf586d4-ctx5x" Oct 07 13:25:21 crc kubenswrapper[4677]: I1007 13:25:21.496288 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0c702d9a-1cc2-40fa-bc18-bad260076f52-fernet-keys\") pod \"keystone-6ddf586d4-ctx5x\" (UID: \"0c702d9a-1cc2-40fa-bc18-bad260076f52\") " pod="keystone-kuttl-tests/keystone-6ddf586d4-ctx5x" Oct 07 13:25:21 crc kubenswrapper[4677]: I1007 13:25:21.496312 4677 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c702d9a-1cc2-40fa-bc18-bad260076f52-combined-ca-bundle\") pod \"keystone-6ddf586d4-ctx5x\" (UID: \"0c702d9a-1cc2-40fa-bc18-bad260076f52\") " pod="keystone-kuttl-tests/keystone-6ddf586d4-ctx5x" Oct 07 13:25:21 crc kubenswrapper[4677]: I1007 13:25:21.496336 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0c702d9a-1cc2-40fa-bc18-bad260076f52-public-tls-certs\") pod \"keystone-6ddf586d4-ctx5x\" (UID: \"0c702d9a-1cc2-40fa-bc18-bad260076f52\") " pod="keystone-kuttl-tests/keystone-6ddf586d4-ctx5x" Oct 07 13:25:21 crc kubenswrapper[4677]: I1007 13:25:21.496355 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b5csn\" (UniqueName: \"kubernetes.io/projected/0c702d9a-1cc2-40fa-bc18-bad260076f52-kube-api-access-b5csn\") pod \"keystone-6ddf586d4-ctx5x\" (UID: \"0c702d9a-1cc2-40fa-bc18-bad260076f52\") " pod="keystone-kuttl-tests/keystone-6ddf586d4-ctx5x" Oct 07 13:25:21 crc kubenswrapper[4677]: I1007 13:25:21.496379 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c702d9a-1cc2-40fa-bc18-bad260076f52-config-data\") pod \"keystone-6ddf586d4-ctx5x\" (UID: \"0c702d9a-1cc2-40fa-bc18-bad260076f52\") " pod="keystone-kuttl-tests/keystone-6ddf586d4-ctx5x" Oct 07 13:25:21 crc kubenswrapper[4677]: I1007 13:25:21.496410 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0c702d9a-1cc2-40fa-bc18-bad260076f52-scripts\") pod \"keystone-6ddf586d4-ctx5x\" (UID: \"0c702d9a-1cc2-40fa-bc18-bad260076f52\") " pod="keystone-kuttl-tests/keystone-6ddf586d4-ctx5x" Oct 07 13:25:21 crc kubenswrapper[4677]: I1007 13:25:21.501275 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c702d9a-1cc2-40fa-bc18-bad260076f52-config-data\") pod \"keystone-6ddf586d4-ctx5x\" (UID: \"0c702d9a-1cc2-40fa-bc18-bad260076f52\") " pod="keystone-kuttl-tests/keystone-6ddf586d4-ctx5x" Oct 07 13:25:21 crc kubenswrapper[4677]: I1007 13:25:21.501275 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c702d9a-1cc2-40fa-bc18-bad260076f52-combined-ca-bundle\") pod \"keystone-6ddf586d4-ctx5x\" (UID: \"0c702d9a-1cc2-40fa-bc18-bad260076f52\") " pod="keystone-kuttl-tests/keystone-6ddf586d4-ctx5x" Oct 07 13:25:21 crc kubenswrapper[4677]: I1007 13:25:21.501551 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0c702d9a-1cc2-40fa-bc18-bad260076f52-fernet-keys\") pod \"keystone-6ddf586d4-ctx5x\" (UID: \"0c702d9a-1cc2-40fa-bc18-bad260076f52\") " pod="keystone-kuttl-tests/keystone-6ddf586d4-ctx5x" Oct 07 13:25:21 crc kubenswrapper[4677]: I1007 13:25:21.501763 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0c702d9a-1cc2-40fa-bc18-bad260076f52-public-tls-certs\") pod \"keystone-6ddf586d4-ctx5x\" (UID: \"0c702d9a-1cc2-40fa-bc18-bad260076f52\") " pod="keystone-kuttl-tests/keystone-6ddf586d4-ctx5x" Oct 07 13:25:21 crc kubenswrapper[4677]: I1007 13:25:21.503555 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/0c702d9a-1cc2-40fa-bc18-bad260076f52-scripts\") pod \"keystone-6ddf586d4-ctx5x\" (UID: \"0c702d9a-1cc2-40fa-bc18-bad260076f52\") " pod="keystone-kuttl-tests/keystone-6ddf586d4-ctx5x" Oct 07 13:25:21 crc kubenswrapper[4677]: I1007 13:25:21.509321 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0c702d9a-1cc2-40fa-bc18-bad260076f52-internal-tls-certs\") pod \"keystone-6ddf586d4-ctx5x\" (UID: \"0c702d9a-1cc2-40fa-bc18-bad260076f52\") " pod="keystone-kuttl-tests/keystone-6ddf586d4-ctx5x" Oct 07 13:25:21 crc kubenswrapper[4677]: I1007 13:25:21.509539 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0c702d9a-1cc2-40fa-bc18-bad260076f52-credential-keys\") pod \"keystone-6ddf586d4-ctx5x\" (UID: \"0c702d9a-1cc2-40fa-bc18-bad260076f52\") " pod="keystone-kuttl-tests/keystone-6ddf586d4-ctx5x" Oct 07 13:25:21 crc kubenswrapper[4677]: I1007 13:25:21.511880 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b5csn\" (UniqueName: \"kubernetes.io/projected/0c702d9a-1cc2-40fa-bc18-bad260076f52-kube-api-access-b5csn\") pod \"keystone-6ddf586d4-ctx5x\" (UID: \"0c702d9a-1cc2-40fa-bc18-bad260076f52\") " pod="keystone-kuttl-tests/keystone-6ddf586d4-ctx5x" Oct 07 13:25:21 crc kubenswrapper[4677]: I1007 13:25:21.618348 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-6ddf586d4-ctx5x" Oct 07 13:25:21 crc kubenswrapper[4677]: I1007 13:25:21.815650 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-6ddf586d4-ctx5x"] Oct 07 13:25:22 crc kubenswrapper[4677]: I1007 13:25:22.196184 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-6ddf586d4-ctx5x" event={"ID":"0c702d9a-1cc2-40fa-bc18-bad260076f52","Type":"ContainerStarted","Data":"8c363a5bb37d6843c4f53b1ab28e10ad75a39c81f80ce548192cebc51845a2e4"} Oct 07 13:25:22 crc kubenswrapper[4677]: I1007 13:25:22.196595 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-6ddf586d4-ctx5x" event={"ID":"0c702d9a-1cc2-40fa-bc18-bad260076f52","Type":"ContainerStarted","Data":"c792320b307de38eaa90b367567096f15422a296cc9056b1de7b8abc3e5b8699"} Oct 07 13:25:22 crc kubenswrapper[4677]: I1007 13:25:22.196624 4677 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="keystone-kuttl-tests/keystone-6ddf586d4-ctx5x" Oct 07 13:25:22 crc kubenswrapper[4677]: I1007 13:25:22.216652 4677 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keystone-kuttl-tests/keystone-6ddf586d4-ctx5x" podStartSLOduration=1.216631003 podStartE2EDuration="1.216631003s" podCreationTimestamp="2025-10-07 13:25:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:25:22.214011938 +0000 UTC m=+1093.699721073" watchObservedRunningTime="2025-10-07 13:25:22.216631003 +0000 UTC m=+1093.702340128" Oct 07 13:25:52 crc kubenswrapper[4677]: I1007 13:25:52.997259 4677 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="keystone-kuttl-tests/keystone-6ddf586d4-ctx5x" Oct 07 13:25:53 crc kubenswrapper[4677]: I1007 13:25:53.556148 4677 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-db-sync-785mf"] Oct 07 13:25:53 crc kubenswrapper[4677]: I1007 
13:25:53.561874 4677 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone-db-sync-785mf"] Oct 07 13:25:53 crc kubenswrapper[4677]: I1007 13:25:53.576891 4677 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-bootstrap-lk9hg"] Oct 07 13:25:53 crc kubenswrapper[4677]: I1007 13:25:53.582354 4677 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-6ddf586d4-ctx5x"] Oct 07 13:25:53 crc kubenswrapper[4677]: I1007 13:25:53.582579 4677 kuberuntime_container.go:808] "Killing container with a grace period" pod="keystone-kuttl-tests/keystone-6ddf586d4-ctx5x" podUID="0c702d9a-1cc2-40fa-bc18-bad260076f52" containerName="keystone-api" containerID="cri-o://8c363a5bb37d6843c4f53b1ab28e10ad75a39c81f80ce548192cebc51845a2e4" gracePeriod=30 Oct 07 13:25:53 crc kubenswrapper[4677]: I1007 13:25:53.586763 4677 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone-bootstrap-lk9hg"] Oct 07 13:25:53 crc kubenswrapper[4677]: I1007 13:25:53.607813 4677 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone1589-account-delete-lmkdp"] Oct 07 13:25:53 crc kubenswrapper[4677]: I1007 13:25:53.608654 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone1589-account-delete-lmkdp" Oct 07 13:25:53 crc kubenswrapper[4677]: I1007 13:25:53.619290 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone1589-account-delete-lmkdp"] Oct 07 13:25:53 crc kubenswrapper[4677]: I1007 13:25:53.680537 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-stkxt\" (UniqueName: \"kubernetes.io/projected/c92f8552-5fd2-448a-9019-4ce1ceb1bbe8-kube-api-access-stkxt\") pod \"keystone1589-account-delete-lmkdp\" (UID: \"c92f8552-5fd2-448a-9019-4ce1ceb1bbe8\") " pod="keystone-kuttl-tests/keystone1589-account-delete-lmkdp" Oct 07 13:25:53 crc kubenswrapper[4677]: I1007 13:25:53.781968 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-stkxt\" (UniqueName: \"kubernetes.io/projected/c92f8552-5fd2-448a-9019-4ce1ceb1bbe8-kube-api-access-stkxt\") pod \"keystone1589-account-delete-lmkdp\" (UID: \"c92f8552-5fd2-448a-9019-4ce1ceb1bbe8\") " pod="keystone-kuttl-tests/keystone1589-account-delete-lmkdp" Oct 07 13:25:53 crc kubenswrapper[4677]: I1007 13:25:53.815711 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-stkxt\" (UniqueName: \"kubernetes.io/projected/c92f8552-5fd2-448a-9019-4ce1ceb1bbe8-kube-api-access-stkxt\") pod \"keystone1589-account-delete-lmkdp\" (UID: \"c92f8552-5fd2-448a-9019-4ce1ceb1bbe8\") " pod="keystone-kuttl-tests/keystone1589-account-delete-lmkdp" Oct 07 13:25:53 crc kubenswrapper[4677]: I1007 13:25:53.940028 4677 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone1589-account-delete-lmkdp" Oct 07 13:25:54 crc kubenswrapper[4677]: I1007 13:25:54.445125 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone1589-account-delete-lmkdp"] Oct 07 13:25:54 crc kubenswrapper[4677]: I1007 13:25:54.490727 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone1589-account-delete-lmkdp" event={"ID":"c92f8552-5fd2-448a-9019-4ce1ceb1bbe8","Type":"ContainerStarted","Data":"8da63dbbb8e99479146895330e71a823d9f2d92b6aa6448c2ea9e18c5c37f66d"} Oct 07 13:25:55 crc kubenswrapper[4677]: I1007 13:25:55.325984 4677 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ca10262-0826-4aa9-ac24-b09d2d294fde" path="/var/lib/kubelet/pods/3ca10262-0826-4aa9-ac24-b09d2d294fde/volumes" Oct 07 13:25:55 crc kubenswrapper[4677]: I1007 13:25:55.327453 4677 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da4cad07-541a-45b6-a8a9-f5c123912314" path="/var/lib/kubelet/pods/da4cad07-541a-45b6-a8a9-f5c123912314/volumes" Oct 07 13:25:55 crc kubenswrapper[4677]: I1007 13:25:55.501916 4677 generic.go:334] "Generic (PLEG): container finished" podID="c92f8552-5fd2-448a-9019-4ce1ceb1bbe8" containerID="262af98a53197751da982603299ca6f3eec94b3f52ea708f5680dfb2dc50bf07" exitCode=0 Oct 07 13:25:55 crc kubenswrapper[4677]: I1007 13:25:55.501980 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone1589-account-delete-lmkdp" event={"ID":"c92f8552-5fd2-448a-9019-4ce1ceb1bbe8","Type":"ContainerDied","Data":"262af98a53197751da982603299ca6f3eec94b3f52ea708f5680dfb2dc50bf07"} Oct 07 13:25:56 crc kubenswrapper[4677]: I1007 13:25:56.843845 4677 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone1589-account-delete-lmkdp" Oct 07 13:25:56 crc kubenswrapper[4677]: I1007 13:25:56.949110 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-stkxt\" (UniqueName: \"kubernetes.io/projected/c92f8552-5fd2-448a-9019-4ce1ceb1bbe8-kube-api-access-stkxt\") pod \"c92f8552-5fd2-448a-9019-4ce1ceb1bbe8\" (UID: \"c92f8552-5fd2-448a-9019-4ce1ceb1bbe8\") " Oct 07 13:25:56 crc kubenswrapper[4677]: I1007 13:25:56.958311 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c92f8552-5fd2-448a-9019-4ce1ceb1bbe8-kube-api-access-stkxt" (OuterVolumeSpecName: "kube-api-access-stkxt") pod "c92f8552-5fd2-448a-9019-4ce1ceb1bbe8" (UID: "c92f8552-5fd2-448a-9019-4ce1ceb1bbe8"). InnerVolumeSpecName "kube-api-access-stkxt". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:25:56 crc kubenswrapper[4677]: I1007 13:25:56.999532 4677 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-6ddf586d4-ctx5x" Oct 07 13:25:57 crc kubenswrapper[4677]: I1007 13:25:57.050812 4677 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-stkxt\" (UniqueName: \"kubernetes.io/projected/c92f8552-5fd2-448a-9019-4ce1ceb1bbe8-kube-api-access-stkxt\") on node \"crc\" DevicePath \"\"" Oct 07 13:25:57 crc kubenswrapper[4677]: I1007 13:25:57.151896 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b5csn\" (UniqueName: \"kubernetes.io/projected/0c702d9a-1cc2-40fa-bc18-bad260076f52-kube-api-access-b5csn\") pod \"0c702d9a-1cc2-40fa-bc18-bad260076f52\" (UID: \"0c702d9a-1cc2-40fa-bc18-bad260076f52\") " Oct 07 13:25:57 crc kubenswrapper[4677]: I1007 13:25:57.152038 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0c702d9a-1cc2-40fa-bc18-bad260076f52-credential-keys\") pod \"0c702d9a-1cc2-40fa-bc18-bad260076f52\" (UID: \"0c702d9a-1cc2-40fa-bc18-bad260076f52\") " Oct 07 13:25:57 crc kubenswrapper[4677]: I1007 13:25:57.152107 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0c702d9a-1cc2-40fa-bc18-bad260076f52-internal-tls-certs\") pod \"0c702d9a-1cc2-40fa-bc18-bad260076f52\" (UID: \"0c702d9a-1cc2-40fa-bc18-bad260076f52\") " Oct 07 13:25:57 crc kubenswrapper[4677]: I1007 13:25:57.152226 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0c702d9a-1cc2-40fa-bc18-bad260076f52-scripts\") pod \"0c702d9a-1cc2-40fa-bc18-bad260076f52\" (UID: \"0c702d9a-1cc2-40fa-bc18-bad260076f52\") " Oct 07 13:25:57 crc kubenswrapper[4677]: I1007 13:25:57.152914 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0c702d9a-1cc2-40fa-bc18-bad260076f52-public-tls-certs\") pod \"0c702d9a-1cc2-40fa-bc18-bad260076f52\" (UID: \"0c702d9a-1cc2-40fa-bc18-bad260076f52\") " Oct 07 13:25:57 crc kubenswrapper[4677]: I1007 13:25:57.153013 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c702d9a-1cc2-40fa-bc18-bad260076f52-config-data\") pod \"0c702d9a-1cc2-40fa-bc18-bad260076f52\" (UID: \"0c702d9a-1cc2-40fa-bc18-bad260076f52\") " Oct 07 13:25:57 crc kubenswrapper[4677]: I1007 13:25:57.153087 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c702d9a-1cc2-40fa-bc18-bad260076f52-combined-ca-bundle\") pod \"0c702d9a-1cc2-40fa-bc18-bad260076f52\" (UID: \"0c702d9a-1cc2-40fa-bc18-bad260076f52\") " Oct 07 13:25:57 crc kubenswrapper[4677]: I1007 13:25:57.153246 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0c702d9a-1cc2-40fa-bc18-bad260076f52-fernet-keys\") pod \"0c702d9a-1cc2-40fa-bc18-bad260076f52\" (UID: \"0c702d9a-1cc2-40fa-bc18-bad260076f52\") " Oct 07 13:25:57 crc kubenswrapper[4677]: I1007 13:25:57.157363 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c702d9a-1cc2-40fa-bc18-bad260076f52-kube-api-access-b5csn" (OuterVolumeSpecName: "kube-api-access-b5csn") pod "0c702d9a-1cc2-40fa-bc18-bad260076f52" (UID: "0c702d9a-1cc2-40fa-bc18-bad260076f52"). 
InnerVolumeSpecName "kube-api-access-b5csn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:25:57 crc kubenswrapper[4677]: I1007 13:25:57.157646 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c702d9a-1cc2-40fa-bc18-bad260076f52-scripts" (OuterVolumeSpecName: "scripts") pod "0c702d9a-1cc2-40fa-bc18-bad260076f52" (UID: "0c702d9a-1cc2-40fa-bc18-bad260076f52"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:25:57 crc kubenswrapper[4677]: I1007 13:25:57.158985 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c702d9a-1cc2-40fa-bc18-bad260076f52-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "0c702d9a-1cc2-40fa-bc18-bad260076f52" (UID: "0c702d9a-1cc2-40fa-bc18-bad260076f52"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:25:57 crc kubenswrapper[4677]: I1007 13:25:57.159353 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c702d9a-1cc2-40fa-bc18-bad260076f52-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "0c702d9a-1cc2-40fa-bc18-bad260076f52" (UID: "0c702d9a-1cc2-40fa-bc18-bad260076f52"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:25:57 crc kubenswrapper[4677]: I1007 13:25:57.184667 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c702d9a-1cc2-40fa-bc18-bad260076f52-config-data" (OuterVolumeSpecName: "config-data") pod "0c702d9a-1cc2-40fa-bc18-bad260076f52" (UID: "0c702d9a-1cc2-40fa-bc18-bad260076f52"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:25:57 crc kubenswrapper[4677]: I1007 13:25:57.188012 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c702d9a-1cc2-40fa-bc18-bad260076f52-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0c702d9a-1cc2-40fa-bc18-bad260076f52" (UID: "0c702d9a-1cc2-40fa-bc18-bad260076f52"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:25:57 crc kubenswrapper[4677]: I1007 13:25:57.209255 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c702d9a-1cc2-40fa-bc18-bad260076f52-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "0c702d9a-1cc2-40fa-bc18-bad260076f52" (UID: "0c702d9a-1cc2-40fa-bc18-bad260076f52"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:25:57 crc kubenswrapper[4677]: I1007 13:25:57.210621 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c702d9a-1cc2-40fa-bc18-bad260076f52-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "0c702d9a-1cc2-40fa-bc18-bad260076f52" (UID: "0c702d9a-1cc2-40fa-bc18-bad260076f52"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:25:57 crc kubenswrapper[4677]: I1007 13:25:57.256076 4677 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0c702d9a-1cc2-40fa-bc18-bad260076f52-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 07 13:25:57 crc kubenswrapper[4677]: I1007 13:25:57.256123 4677 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b5csn\" (UniqueName: \"kubernetes.io/projected/0c702d9a-1cc2-40fa-bc18-bad260076f52-kube-api-access-b5csn\") on node \"crc\" DevicePath \"\"" Oct 07 13:25:57 crc kubenswrapper[4677]: I1007 13:25:57.256145 4677 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0c702d9a-1cc2-40fa-bc18-bad260076f52-credential-keys\") on node \"crc\" DevicePath \"\"" Oct 07 13:25:57 crc kubenswrapper[4677]: I1007 13:25:57.256165 4677 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0c702d9a-1cc2-40fa-bc18-bad260076f52-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 07 13:25:57 crc kubenswrapper[4677]: I1007 13:25:57.256185 4677 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0c702d9a-1cc2-40fa-bc18-bad260076f52-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 13:25:57 crc kubenswrapper[4677]: I1007 13:25:57.256206 4677 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0c702d9a-1cc2-40fa-bc18-bad260076f52-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 07 13:25:57 crc kubenswrapper[4677]: I1007 13:25:57.256229 4677 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c702d9a-1cc2-40fa-bc18-bad260076f52-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 13:25:57 crc kubenswrapper[4677]: I1007 13:25:57.256253 4677 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c702d9a-1cc2-40fa-bc18-bad260076f52-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 13:25:57 crc kubenswrapper[4677]: I1007 13:25:57.520788 4677 generic.go:334] "Generic (PLEG): container finished" podID="0c702d9a-1cc2-40fa-bc18-bad260076f52" containerID="8c363a5bb37d6843c4f53b1ab28e10ad75a39c81f80ce548192cebc51845a2e4" exitCode=0 Oct 07 13:25:57 crc kubenswrapper[4677]: I1007 13:25:57.520877 4677 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-6ddf586d4-ctx5x" Oct 07 13:25:57 crc kubenswrapper[4677]: I1007 13:25:57.520931 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-6ddf586d4-ctx5x" event={"ID":"0c702d9a-1cc2-40fa-bc18-bad260076f52","Type":"ContainerDied","Data":"8c363a5bb37d6843c4f53b1ab28e10ad75a39c81f80ce548192cebc51845a2e4"} Oct 07 13:25:57 crc kubenswrapper[4677]: I1007 13:25:57.520987 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-6ddf586d4-ctx5x" event={"ID":"0c702d9a-1cc2-40fa-bc18-bad260076f52","Type":"ContainerDied","Data":"c792320b307de38eaa90b367567096f15422a296cc9056b1de7b8abc3e5b8699"} Oct 07 13:25:57 crc kubenswrapper[4677]: I1007 13:25:57.521020 4677 scope.go:117] "RemoveContainer" containerID="8c363a5bb37d6843c4f53b1ab28e10ad75a39c81f80ce548192cebc51845a2e4" Oct 07 13:25:57 crc kubenswrapper[4677]: I1007 13:25:57.526119 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone1589-account-delete-lmkdp" event={"ID":"c92f8552-5fd2-448a-9019-4ce1ceb1bbe8","Type":"ContainerDied","Data":"8da63dbbb8e99479146895330e71a823d9f2d92b6aa6448c2ea9e18c5c37f66d"} Oct 07 13:25:57 crc kubenswrapper[4677]: I1007 13:25:57.526170 4677 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8da63dbbb8e99479146895330e71a823d9f2d92b6aa6448c2ea9e18c5c37f66d" Oct 07 13:25:57 crc kubenswrapper[4677]: I1007 13:25:57.526131 4677 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone1589-account-delete-lmkdp" Oct 07 13:25:57 crc kubenswrapper[4677]: I1007 13:25:57.561636 4677 scope.go:117] "RemoveContainer" containerID="8c363a5bb37d6843c4f53b1ab28e10ad75a39c81f80ce548192cebc51845a2e4" Oct 07 13:25:57 crc kubenswrapper[4677]: E1007 13:25:57.562883 4677 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8c363a5bb37d6843c4f53b1ab28e10ad75a39c81f80ce548192cebc51845a2e4\": container with ID starting with 8c363a5bb37d6843c4f53b1ab28e10ad75a39c81f80ce548192cebc51845a2e4 not found: ID does not exist" containerID="8c363a5bb37d6843c4f53b1ab28e10ad75a39c81f80ce548192cebc51845a2e4" Oct 07 13:25:57 crc kubenswrapper[4677]: I1007 13:25:57.562956 4677 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c363a5bb37d6843c4f53b1ab28e10ad75a39c81f80ce548192cebc51845a2e4"} err="failed to get container status \"8c363a5bb37d6843c4f53b1ab28e10ad75a39c81f80ce548192cebc51845a2e4\": rpc error: code = NotFound desc = could not find container \"8c363a5bb37d6843c4f53b1ab28e10ad75a39c81f80ce548192cebc51845a2e4\": container with ID starting with 8c363a5bb37d6843c4f53b1ab28e10ad75a39c81f80ce548192cebc51845a2e4 not found: ID does not exist" Oct 07 13:25:57 crc kubenswrapper[4677]: I1007 13:25:57.563233 4677 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-6ddf586d4-ctx5x"] Oct 07 13:25:57 crc kubenswrapper[4677]: I1007 13:25:57.572074 4677 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone-6ddf586d4-ctx5x"] Oct 07 13:25:58 crc kubenswrapper[4677]: I1007 13:25:58.660877 4677 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-1589-account-create-7pddq"] Oct 07 13:25:58 crc kubenswrapper[4677]: I1007 13:25:58.671087 4677 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["keystone-kuttl-tests/keystone1589-account-delete-lmkdp"] Oct 07 13:25:58 crc kubenswrapper[4677]: I1007 13:25:58.677840 4677 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-db-create-qljk2"] Oct 07 13:25:58 crc kubenswrapper[4677]: I1007 13:25:58.682069 4677 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone-1589-account-create-7pddq"] Oct 07 13:25:58 crc kubenswrapper[4677]: I1007 13:25:58.686238 4677 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone1589-account-delete-lmkdp"] Oct 07 13:25:58 crc kubenswrapper[4677]: I1007 13:25:58.690517 4677 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone-db-create-qljk2"] Oct 07 13:25:58 crc kubenswrapper[4677]: I1007 13:25:58.806340 4677 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone-db-create-z4tf5"] Oct 07 13:25:58 crc kubenswrapper[4677]: E1007 13:25:58.806613 4677 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c92f8552-5fd2-448a-9019-4ce1ceb1bbe8" containerName="mariadb-account-delete" Oct 07 13:25:58 crc kubenswrapper[4677]: I1007 13:25:58.806633 4677 state_mem.go:107] "Deleted CPUSet assignment" podUID="c92f8552-5fd2-448a-9019-4ce1ceb1bbe8" containerName="mariadb-account-delete" Oct 07 13:25:58 crc kubenswrapper[4677]: E1007 13:25:58.806662 4677 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c702d9a-1cc2-40fa-bc18-bad260076f52" containerName="keystone-api" Oct 07 13:25:58 crc kubenswrapper[4677]: I1007 13:25:58.806670 4677 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c702d9a-1cc2-40fa-bc18-bad260076f52" containerName="keystone-api" Oct 07 13:25:58 crc kubenswrapper[4677]: I1007 13:25:58.806798 4677 memory_manager.go:354] "RemoveStaleState removing state" podUID="c92f8552-5fd2-448a-9019-4ce1ceb1bbe8" containerName="mariadb-account-delete" Oct 07 13:25:58 crc kubenswrapper[4677]: I1007 13:25:58.806817 4677 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c702d9a-1cc2-40fa-bc18-bad260076f52" containerName="keystone-api" Oct 07 13:25:58 crc kubenswrapper[4677]: I1007 13:25:58.807304 4677 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-db-create-z4tf5" Oct 07 13:25:58 crc kubenswrapper[4677]: I1007 13:25:58.817054 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-db-create-z4tf5"] Oct 07 13:25:58 crc kubenswrapper[4677]: I1007 13:25:58.881367 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-brqfl\" (UniqueName: \"kubernetes.io/projected/10d26766-ede1-405c-8838-2e9273105a22-kube-api-access-brqfl\") pod \"keystone-db-create-z4tf5\" (UID: \"10d26766-ede1-405c-8838-2e9273105a22\") " pod="keystone-kuttl-tests/keystone-db-create-z4tf5" Oct 07 13:25:58 crc kubenswrapper[4677]: I1007 13:25:58.982820 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-brqfl\" (UniqueName: \"kubernetes.io/projected/10d26766-ede1-405c-8838-2e9273105a22-kube-api-access-brqfl\") pod \"keystone-db-create-z4tf5\" (UID: \"10d26766-ede1-405c-8838-2e9273105a22\") " pod="keystone-kuttl-tests/keystone-db-create-z4tf5" Oct 07 13:25:59 crc kubenswrapper[4677]: I1007 13:25:59.011682 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-brqfl\" (UniqueName: \"kubernetes.io/projected/10d26766-ede1-405c-8838-2e9273105a22-kube-api-access-brqfl\") pod \"keystone-db-create-z4tf5\" (UID: \"10d26766-ede1-405c-8838-2e9273105a22\") " pod="keystone-kuttl-tests/keystone-db-create-z4tf5" Oct 07 13:25:59 crc kubenswrapper[4677]: I1007 13:25:59.121724 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-db-create-z4tf5" Oct 07 13:25:59 crc kubenswrapper[4677]: I1007 13:25:59.311647 4677 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0c702d9a-1cc2-40fa-bc18-bad260076f52" path="/var/lib/kubelet/pods/0c702d9a-1cc2-40fa-bc18-bad260076f52/volumes" Oct 07 13:25:59 crc kubenswrapper[4677]: I1007 13:25:59.312626 4677 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1a7c7c71-b92c-4595-8b66-a041dc94c05f" path="/var/lib/kubelet/pods/1a7c7c71-b92c-4595-8b66-a041dc94c05f/volumes" Oct 07 13:25:59 crc kubenswrapper[4677]: I1007 13:25:59.313049 4677 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ab45b0-1e6e-49af-9aff-3c9716562dc8" path="/var/lib/kubelet/pods/49ab45b0-1e6e-49af-9aff-3c9716562dc8/volumes" Oct 07 13:25:59 crc kubenswrapper[4677]: I1007 13:25:59.313483 4677 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c92f8552-5fd2-448a-9019-4ce1ceb1bbe8" path="/var/lib/kubelet/pods/c92f8552-5fd2-448a-9019-4ce1ceb1bbe8/volumes" Oct 07 13:25:59 crc kubenswrapper[4677]: I1007 13:25:59.347244 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-db-create-z4tf5"] Oct 07 13:25:59 crc kubenswrapper[4677]: W1007 13:25:59.351963 4677 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod10d26766_ede1_405c_8838_2e9273105a22.slice/crio-e60b1a9d4fa5c9f0a6c41553f8a18b913c41f9eba4ab5aeeb5506f3adb6dc5b2 WatchSource:0}: Error finding container e60b1a9d4fa5c9f0a6c41553f8a18b913c41f9eba4ab5aeeb5506f3adb6dc5b2: Status 404 returned error can't find the container with id e60b1a9d4fa5c9f0a6c41553f8a18b913c41f9eba4ab5aeeb5506f3adb6dc5b2 Oct 07 13:25:59 crc kubenswrapper[4677]: I1007 13:25:59.547529 4677 generic.go:334] "Generic (PLEG): container finished" podID="10d26766-ede1-405c-8838-2e9273105a22" 
containerID="4e147f80d89000dd8d28c33f3913c5594ffcd6e05509c6b6bfc82176abcc9c3a" exitCode=0 Oct 07 13:25:59 crc kubenswrapper[4677]: I1007 13:25:59.547580 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-create-z4tf5" event={"ID":"10d26766-ede1-405c-8838-2e9273105a22","Type":"ContainerDied","Data":"4e147f80d89000dd8d28c33f3913c5594ffcd6e05509c6b6bfc82176abcc9c3a"} Oct 07 13:25:59 crc kubenswrapper[4677]: I1007 13:25:59.547827 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-create-z4tf5" event={"ID":"10d26766-ede1-405c-8838-2e9273105a22","Type":"ContainerStarted","Data":"e60b1a9d4fa5c9f0a6c41553f8a18b913c41f9eba4ab5aeeb5506f3adb6dc5b2"} Oct 07 13:26:00 crc kubenswrapper[4677]: I1007 13:26:00.870455 4677 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-db-create-z4tf5" Oct 07 13:26:01 crc kubenswrapper[4677]: I1007 13:26:01.018196 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-brqfl\" (UniqueName: \"kubernetes.io/projected/10d26766-ede1-405c-8838-2e9273105a22-kube-api-access-brqfl\") pod \"10d26766-ede1-405c-8838-2e9273105a22\" (UID: \"10d26766-ede1-405c-8838-2e9273105a22\") " Oct 07 13:26:01 crc kubenswrapper[4677]: I1007 13:26:01.025648 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/10d26766-ede1-405c-8838-2e9273105a22-kube-api-access-brqfl" (OuterVolumeSpecName: "kube-api-access-brqfl") pod "10d26766-ede1-405c-8838-2e9273105a22" (UID: "10d26766-ede1-405c-8838-2e9273105a22"). InnerVolumeSpecName "kube-api-access-brqfl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:26:01 crc kubenswrapper[4677]: I1007 13:26:01.119905 4677 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-brqfl\" (UniqueName: \"kubernetes.io/projected/10d26766-ede1-405c-8838-2e9273105a22-kube-api-access-brqfl\") on node \"crc\" DevicePath \"\"" Oct 07 13:26:01 crc kubenswrapper[4677]: I1007 13:26:01.566496 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-create-z4tf5" event={"ID":"10d26766-ede1-405c-8838-2e9273105a22","Type":"ContainerDied","Data":"e60b1a9d4fa5c9f0a6c41553f8a18b913c41f9eba4ab5aeeb5506f3adb6dc5b2"} Oct 07 13:26:01 crc kubenswrapper[4677]: I1007 13:26:01.566840 4677 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e60b1a9d4fa5c9f0a6c41553f8a18b913c41f9eba4ab5aeeb5506f3adb6dc5b2" Oct 07 13:26:01 crc kubenswrapper[4677]: I1007 13:26:01.566661 4677 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-db-create-z4tf5" Oct 07 13:26:08 crc kubenswrapper[4677]: I1007 13:26:08.844146 4677 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone-fde5-account-create-qxdpj"] Oct 07 13:26:08 crc kubenswrapper[4677]: E1007 13:26:08.845171 4677 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10d26766-ede1-405c-8838-2e9273105a22" containerName="mariadb-database-create" Oct 07 13:26:08 crc kubenswrapper[4677]: I1007 13:26:08.845192 4677 state_mem.go:107] "Deleted CPUSet assignment" podUID="10d26766-ede1-405c-8838-2e9273105a22" containerName="mariadb-database-create" Oct 07 13:26:08 crc kubenswrapper[4677]: I1007 13:26:08.845460 4677 memory_manager.go:354] "RemoveStaleState removing state" podUID="10d26766-ede1-405c-8838-2e9273105a22" containerName="mariadb-database-create" Oct 07 13:26:08 crc kubenswrapper[4677]: I1007 13:26:08.846111 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-fde5-account-create-qxdpj" Oct 07 13:26:08 crc kubenswrapper[4677]: I1007 13:26:08.849839 4677 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-db-secret" Oct 07 13:26:08 crc kubenswrapper[4677]: I1007 13:26:08.861021 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-fde5-account-create-qxdpj"] Oct 07 13:26:08 crc kubenswrapper[4677]: I1007 13:26:08.938820 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x8lgx\" (UniqueName: \"kubernetes.io/projected/efe6d5a4-3ea9-48da-8a63-ec7eca7e3fa1-kube-api-access-x8lgx\") pod \"keystone-fde5-account-create-qxdpj\" (UID: \"efe6d5a4-3ea9-48da-8a63-ec7eca7e3fa1\") " pod="keystone-kuttl-tests/keystone-fde5-account-create-qxdpj" Oct 07 13:26:09 crc kubenswrapper[4677]: I1007 13:26:09.040096 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x8lgx\" (UniqueName: \"kubernetes.io/projected/efe6d5a4-3ea9-48da-8a63-ec7eca7e3fa1-kube-api-access-x8lgx\") pod \"keystone-fde5-account-create-qxdpj\" (UID: \"efe6d5a4-3ea9-48da-8a63-ec7eca7e3fa1\") " pod="keystone-kuttl-tests/keystone-fde5-account-create-qxdpj" Oct 07 13:26:09 crc kubenswrapper[4677]: I1007 13:26:09.070629 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x8lgx\" (UniqueName: \"kubernetes.io/projected/efe6d5a4-3ea9-48da-8a63-ec7eca7e3fa1-kube-api-access-x8lgx\") pod \"keystone-fde5-account-create-qxdpj\" (UID: \"efe6d5a4-3ea9-48da-8a63-ec7eca7e3fa1\") " pod="keystone-kuttl-tests/keystone-fde5-account-create-qxdpj" Oct 07 13:26:09 crc kubenswrapper[4677]: I1007 13:26:09.176666 4677 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-fde5-account-create-qxdpj" Oct 07 13:26:09 crc kubenswrapper[4677]: I1007 13:26:09.413784 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-fde5-account-create-qxdpj"] Oct 07 13:26:09 crc kubenswrapper[4677]: I1007 13:26:09.428483 4677 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-db-secret" Oct 07 13:26:09 crc kubenswrapper[4677]: I1007 13:26:09.647211 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-fde5-account-create-qxdpj" event={"ID":"efe6d5a4-3ea9-48da-8a63-ec7eca7e3fa1","Type":"ContainerStarted","Data":"42be7a25f38277e2f0f91260a907388c468e63e3d2f057825c401b0019340f9a"} Oct 07 13:26:09 crc kubenswrapper[4677]: I1007 13:26:09.647258 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-fde5-account-create-qxdpj" event={"ID":"efe6d5a4-3ea9-48da-8a63-ec7eca7e3fa1","Type":"ContainerStarted","Data":"2c8851c7e763ceef5692d0bee820b244331ef1d0d2c825aefcb6ff39adc501c4"} Oct 07 13:26:09 crc kubenswrapper[4677]: I1007 13:26:09.669307 4677 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keystone-kuttl-tests/keystone-fde5-account-create-qxdpj" podStartSLOduration=1.669280882 podStartE2EDuration="1.669280882s" podCreationTimestamp="2025-10-07 13:26:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:26:09.666130841 +0000 UTC m=+1141.151839976" watchObservedRunningTime="2025-10-07 13:26:09.669280882 +0000 UTC m=+1141.154990037" Oct 07 13:26:10 crc kubenswrapper[4677]: I1007 13:26:10.660419 4677 generic.go:334] "Generic (PLEG): container finished" podID="efe6d5a4-3ea9-48da-8a63-ec7eca7e3fa1" containerID="42be7a25f38277e2f0f91260a907388c468e63e3d2f057825c401b0019340f9a" exitCode=0 Oct 07 13:26:10 crc kubenswrapper[4677]: I1007 13:26:10.660602 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-fde5-account-create-qxdpj" event={"ID":"efe6d5a4-3ea9-48da-8a63-ec7eca7e3fa1","Type":"ContainerDied","Data":"42be7a25f38277e2f0f91260a907388c468e63e3d2f057825c401b0019340f9a"} Oct 07 13:26:12 crc kubenswrapper[4677]: I1007 13:26:12.063025 4677 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-fde5-account-create-qxdpj" Oct 07 13:26:12 crc kubenswrapper[4677]: I1007 13:26:12.196709 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x8lgx\" (UniqueName: \"kubernetes.io/projected/efe6d5a4-3ea9-48da-8a63-ec7eca7e3fa1-kube-api-access-x8lgx\") pod \"efe6d5a4-3ea9-48da-8a63-ec7eca7e3fa1\" (UID: \"efe6d5a4-3ea9-48da-8a63-ec7eca7e3fa1\") " Oct 07 13:26:12 crc kubenswrapper[4677]: I1007 13:26:12.205965 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efe6d5a4-3ea9-48da-8a63-ec7eca7e3fa1-kube-api-access-x8lgx" (OuterVolumeSpecName: "kube-api-access-x8lgx") pod "efe6d5a4-3ea9-48da-8a63-ec7eca7e3fa1" (UID: "efe6d5a4-3ea9-48da-8a63-ec7eca7e3fa1"). InnerVolumeSpecName "kube-api-access-x8lgx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:26:12 crc kubenswrapper[4677]: I1007 13:26:12.298778 4677 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x8lgx\" (UniqueName: \"kubernetes.io/projected/efe6d5a4-3ea9-48da-8a63-ec7eca7e3fa1-kube-api-access-x8lgx\") on node \"crc\" DevicePath \"\"" Oct 07 13:26:12 crc kubenswrapper[4677]: I1007 13:26:12.681992 4677 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-fde5-account-create-qxdpj" Oct 07 13:26:12 crc kubenswrapper[4677]: I1007 13:26:12.681958 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-fde5-account-create-qxdpj" event={"ID":"efe6d5a4-3ea9-48da-8a63-ec7eca7e3fa1","Type":"ContainerDied","Data":"2c8851c7e763ceef5692d0bee820b244331ef1d0d2c825aefcb6ff39adc501c4"} Oct 07 13:26:12 crc kubenswrapper[4677]: I1007 13:26:12.682131 4677 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2c8851c7e763ceef5692d0bee820b244331ef1d0d2c825aefcb6ff39adc501c4" Oct 07 13:26:14 crc kubenswrapper[4677]: I1007 13:26:14.435370 4677 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone-db-sync-4rbcx"] Oct 07 13:26:14 crc kubenswrapper[4677]: E1007 13:26:14.436904 4677 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="efe6d5a4-3ea9-48da-8a63-ec7eca7e3fa1" containerName="mariadb-account-create" Oct 07 13:26:14 crc kubenswrapper[4677]: I1007 13:26:14.436990 4677 state_mem.go:107] "Deleted CPUSet assignment" podUID="efe6d5a4-3ea9-48da-8a63-ec7eca7e3fa1" containerName="mariadb-account-create" Oct 07 13:26:14 crc kubenswrapper[4677]: I1007 13:26:14.437234 4677 memory_manager.go:354] "RemoveStaleState removing state" podUID="efe6d5a4-3ea9-48da-8a63-ec7eca7e3fa1" containerName="mariadb-account-create" Oct 07 13:26:14 crc kubenswrapper[4677]: I1007 13:26:14.437810 4677 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-db-sync-4rbcx" Oct 07 13:26:14 crc kubenswrapper[4677]: I1007 13:26:14.440575 4677 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-scripts" Oct 07 13:26:14 crc kubenswrapper[4677]: I1007 13:26:14.440708 4677 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-keystone-dockercfg-r8pj8" Oct 07 13:26:14 crc kubenswrapper[4677]: I1007 13:26:14.440868 4677 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-config-data" Oct 07 13:26:14 crc kubenswrapper[4677]: I1007 13:26:14.441908 4677 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone" Oct 07 13:26:14 crc kubenswrapper[4677]: I1007 13:26:14.454008 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-db-sync-4rbcx"] Oct 07 13:26:14 crc kubenswrapper[4677]: I1007 13:26:14.535397 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v5mhn\" (UniqueName: \"kubernetes.io/projected/f009e54b-31c5-46c8-921e-38ce4c2efc7d-kube-api-access-v5mhn\") pod \"keystone-db-sync-4rbcx\" (UID: \"f009e54b-31c5-46c8-921e-38ce4c2efc7d\") " pod="keystone-kuttl-tests/keystone-db-sync-4rbcx" Oct 07 13:26:14 crc kubenswrapper[4677]: I1007 13:26:14.535738 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f009e54b-31c5-46c8-921e-38ce4c2efc7d-config-data\") pod \"keystone-db-sync-4rbcx\" (UID: \"f009e54b-31c5-46c8-921e-38ce4c2efc7d\") " pod="keystone-kuttl-tests/keystone-db-sync-4rbcx" Oct 07 13:26:14 crc kubenswrapper[4677]: I1007 13:26:14.636856 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f009e54b-31c5-46c8-921e-38ce4c2efc7d-config-data\") pod \"keystone-db-sync-4rbcx\" (UID: \"f009e54b-31c5-46c8-921e-38ce4c2efc7d\") " pod="keystone-kuttl-tests/keystone-db-sync-4rbcx" Oct 07 13:26:14 crc kubenswrapper[4677]: I1007 13:26:14.636991 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v5mhn\" (UniqueName: \"kubernetes.io/projected/f009e54b-31c5-46c8-921e-38ce4c2efc7d-kube-api-access-v5mhn\") pod \"keystone-db-sync-4rbcx\" (UID: \"f009e54b-31c5-46c8-921e-38ce4c2efc7d\") " pod="keystone-kuttl-tests/keystone-db-sync-4rbcx" Oct 07 13:26:14 crc kubenswrapper[4677]: I1007 13:26:14.645309 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f009e54b-31c5-46c8-921e-38ce4c2efc7d-config-data\") pod \"keystone-db-sync-4rbcx\" (UID: \"f009e54b-31c5-46c8-921e-38ce4c2efc7d\") " pod="keystone-kuttl-tests/keystone-db-sync-4rbcx" Oct 07 13:26:14 crc kubenswrapper[4677]: I1007 13:26:14.667535 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v5mhn\" (UniqueName: \"kubernetes.io/projected/f009e54b-31c5-46c8-921e-38ce4c2efc7d-kube-api-access-v5mhn\") pod \"keystone-db-sync-4rbcx\" (UID: \"f009e54b-31c5-46c8-921e-38ce4c2efc7d\") " pod="keystone-kuttl-tests/keystone-db-sync-4rbcx" Oct 07 13:26:14 crc kubenswrapper[4677]: I1007 13:26:14.772246 4677 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-db-sync-4rbcx" Oct 07 13:26:15 crc kubenswrapper[4677]: I1007 13:26:15.260419 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-db-sync-4rbcx"] Oct 07 13:26:15 crc kubenswrapper[4677]: I1007 13:26:15.709581 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-sync-4rbcx" event={"ID":"f009e54b-31c5-46c8-921e-38ce4c2efc7d","Type":"ContainerStarted","Data":"cf6d3dd7ac8321d14de7d9be095d826eee31026829d19d639d8e4ff79def663e"} Oct 07 13:26:15 crc kubenswrapper[4677]: I1007 13:26:15.709909 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-sync-4rbcx" event={"ID":"f009e54b-31c5-46c8-921e-38ce4c2efc7d","Type":"ContainerStarted","Data":"7a1c87879c3d8dc0296f5bbac7e2f7c6918633bceab6345f289ceff085412c9f"} Oct 07 13:26:15 crc kubenswrapper[4677]: I1007 13:26:15.733107 4677 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keystone-kuttl-tests/keystone-db-sync-4rbcx" podStartSLOduration=1.733078978 podStartE2EDuration="1.733078978s" podCreationTimestamp="2025-10-07 13:26:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:26:15.732266785 +0000 UTC m=+1147.217975910" watchObservedRunningTime="2025-10-07 13:26:15.733078978 +0000 UTC m=+1147.218788133" Oct 07 13:26:17 crc kubenswrapper[4677]: I1007 13:26:17.735613 4677 generic.go:334] "Generic (PLEG): container finished" podID="f009e54b-31c5-46c8-921e-38ce4c2efc7d" containerID="cf6d3dd7ac8321d14de7d9be095d826eee31026829d19d639d8e4ff79def663e" exitCode=0 Oct 07 13:26:17 crc kubenswrapper[4677]: I1007 13:26:17.735709 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-sync-4rbcx" event={"ID":"f009e54b-31c5-46c8-921e-38ce4c2efc7d","Type":"ContainerDied","Data":"cf6d3dd7ac8321d14de7d9be095d826eee31026829d19d639d8e4ff79def663e"} Oct 07 13:26:19 crc kubenswrapper[4677]: I1007 13:26:19.072625 4677 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-db-sync-4rbcx" Oct 07 13:26:19 crc kubenswrapper[4677]: I1007 13:26:19.213394 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f009e54b-31c5-46c8-921e-38ce4c2efc7d-config-data\") pod \"f009e54b-31c5-46c8-921e-38ce4c2efc7d\" (UID: \"f009e54b-31c5-46c8-921e-38ce4c2efc7d\") " Oct 07 13:26:19 crc kubenswrapper[4677]: I1007 13:26:19.213588 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v5mhn\" (UniqueName: \"kubernetes.io/projected/f009e54b-31c5-46c8-921e-38ce4c2efc7d-kube-api-access-v5mhn\") pod \"f009e54b-31c5-46c8-921e-38ce4c2efc7d\" (UID: \"f009e54b-31c5-46c8-921e-38ce4c2efc7d\") " Oct 07 13:26:19 crc kubenswrapper[4677]: I1007 13:26:19.220049 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f009e54b-31c5-46c8-921e-38ce4c2efc7d-kube-api-access-v5mhn" (OuterVolumeSpecName: "kube-api-access-v5mhn") pod "f009e54b-31c5-46c8-921e-38ce4c2efc7d" (UID: "f009e54b-31c5-46c8-921e-38ce4c2efc7d"). InnerVolumeSpecName "kube-api-access-v5mhn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:26:19 crc kubenswrapper[4677]: I1007 13:26:19.253725 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f009e54b-31c5-46c8-921e-38ce4c2efc7d-config-data" (OuterVolumeSpecName: "config-data") pod "f009e54b-31c5-46c8-921e-38ce4c2efc7d" (UID: "f009e54b-31c5-46c8-921e-38ce4c2efc7d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:26:19 crc kubenswrapper[4677]: I1007 13:26:19.315406 4677 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f009e54b-31c5-46c8-921e-38ce4c2efc7d-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 13:26:19 crc kubenswrapper[4677]: I1007 13:26:19.315486 4677 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v5mhn\" (UniqueName: \"kubernetes.io/projected/f009e54b-31c5-46c8-921e-38ce4c2efc7d-kube-api-access-v5mhn\") on node \"crc\" DevicePath \"\"" Oct 07 13:26:19 crc kubenswrapper[4677]: I1007 13:26:19.754215 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-sync-4rbcx" event={"ID":"f009e54b-31c5-46c8-921e-38ce4c2efc7d","Type":"ContainerDied","Data":"7a1c87879c3d8dc0296f5bbac7e2f7c6918633bceab6345f289ceff085412c9f"} Oct 07 13:26:19 crc kubenswrapper[4677]: I1007 13:26:19.754263 4677 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7a1c87879c3d8dc0296f5bbac7e2f7c6918633bceab6345f289ceff085412c9f" Oct 07 13:26:19 crc kubenswrapper[4677]: I1007 13:26:19.754289 4677 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-db-sync-4rbcx" Oct 07 13:26:19 crc kubenswrapper[4677]: I1007 13:26:19.941380 4677 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone-bootstrap-hhxfq"] Oct 07 13:26:19 crc kubenswrapper[4677]: E1007 13:26:19.941847 4677 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f009e54b-31c5-46c8-921e-38ce4c2efc7d" containerName="keystone-db-sync" Oct 07 13:26:19 crc kubenswrapper[4677]: I1007 13:26:19.941895 4677 state_mem.go:107] "Deleted CPUSet assignment" podUID="f009e54b-31c5-46c8-921e-38ce4c2efc7d" containerName="keystone-db-sync" Oct 07 13:26:19 crc kubenswrapper[4677]: I1007 13:26:19.942164 4677 memory_manager.go:354] "RemoveStaleState removing state" podUID="f009e54b-31c5-46c8-921e-38ce4c2efc7d" containerName="keystone-db-sync" Oct 07 13:26:19 crc kubenswrapper[4677]: I1007 13:26:19.945873 4677 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-bootstrap-hhxfq" Oct 07 13:26:19 crc kubenswrapper[4677]: I1007 13:26:19.954122 4677 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-scripts" Oct 07 13:26:19 crc kubenswrapper[4677]: I1007 13:26:19.954229 4677 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone" Oct 07 13:26:19 crc kubenswrapper[4677]: I1007 13:26:19.954359 4677 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-config-data" Oct 07 13:26:19 crc kubenswrapper[4677]: I1007 13:26:19.954546 4677 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-keystone-dockercfg-r8pj8" Oct 07 13:26:19 crc kubenswrapper[4677]: I1007 13:26:19.957157 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-bootstrap-hhxfq"] Oct 07 13:26:20 crc kubenswrapper[4677]: I1007 13:26:20.025990 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s56k8\" (UniqueName: \"kubernetes.io/projected/32a0013b-5a72-4319-9f2a-fe6672da61e3-kube-api-access-s56k8\") pod \"keystone-bootstrap-hhxfq\" (UID: \"32a0013b-5a72-4319-9f2a-fe6672da61e3\") " pod="keystone-kuttl-tests/keystone-bootstrap-hhxfq" Oct 07 13:26:20 crc kubenswrapper[4677]: I1007 13:26:20.026035 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/32a0013b-5a72-4319-9f2a-fe6672da61e3-credential-keys\") pod \"keystone-bootstrap-hhxfq\" (UID: \"32a0013b-5a72-4319-9f2a-fe6672da61e3\") " pod="keystone-kuttl-tests/keystone-bootstrap-hhxfq" Oct 07 13:26:20 crc kubenswrapper[4677]: I1007 13:26:20.026062 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/32a0013b-5a72-4319-9f2a-fe6672da61e3-scripts\") pod \"keystone-bootstrap-hhxfq\" (UID: \"32a0013b-5a72-4319-9f2a-fe6672da61e3\") " pod="keystone-kuttl-tests/keystone-bootstrap-hhxfq" Oct 07 13:26:20 crc kubenswrapper[4677]: I1007 13:26:20.026107 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32a0013b-5a72-4319-9f2a-fe6672da61e3-config-data\") pod \"keystone-bootstrap-hhxfq\" (UID: \"32a0013b-5a72-4319-9f2a-fe6672da61e3\") " pod="keystone-kuttl-tests/keystone-bootstrap-hhxfq" Oct 07 13:26:20 crc kubenswrapper[4677]: I1007 13:26:20.026160 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/32a0013b-5a72-4319-9f2a-fe6672da61e3-fernet-keys\") pod \"keystone-bootstrap-hhxfq\" (UID: \"32a0013b-5a72-4319-9f2a-fe6672da61e3\") " pod="keystone-kuttl-tests/keystone-bootstrap-hhxfq" Oct 07 13:26:20 crc kubenswrapper[4677]: I1007 13:26:20.126955 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/32a0013b-5a72-4319-9f2a-fe6672da61e3-fernet-keys\") pod \"keystone-bootstrap-hhxfq\" (UID: \"32a0013b-5a72-4319-9f2a-fe6672da61e3\") " pod="keystone-kuttl-tests/keystone-bootstrap-hhxfq" Oct 07 13:26:20 crc kubenswrapper[4677]: I1007 13:26:20.127073 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s56k8\" (UniqueName: 
\"kubernetes.io/projected/32a0013b-5a72-4319-9f2a-fe6672da61e3-kube-api-access-s56k8\") pod \"keystone-bootstrap-hhxfq\" (UID: \"32a0013b-5a72-4319-9f2a-fe6672da61e3\") " pod="keystone-kuttl-tests/keystone-bootstrap-hhxfq" Oct 07 13:26:20 crc kubenswrapper[4677]: I1007 13:26:20.127099 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/32a0013b-5a72-4319-9f2a-fe6672da61e3-credential-keys\") pod \"keystone-bootstrap-hhxfq\" (UID: \"32a0013b-5a72-4319-9f2a-fe6672da61e3\") " pod="keystone-kuttl-tests/keystone-bootstrap-hhxfq" Oct 07 13:26:20 crc kubenswrapper[4677]: I1007 13:26:20.127124 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/32a0013b-5a72-4319-9f2a-fe6672da61e3-scripts\") pod \"keystone-bootstrap-hhxfq\" (UID: \"32a0013b-5a72-4319-9f2a-fe6672da61e3\") " pod="keystone-kuttl-tests/keystone-bootstrap-hhxfq" Oct 07 13:26:20 crc kubenswrapper[4677]: I1007 13:26:20.127168 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32a0013b-5a72-4319-9f2a-fe6672da61e3-config-data\") pod \"keystone-bootstrap-hhxfq\" (UID: \"32a0013b-5a72-4319-9f2a-fe6672da61e3\") " pod="keystone-kuttl-tests/keystone-bootstrap-hhxfq" Oct 07 13:26:20 crc kubenswrapper[4677]: I1007 13:26:20.130888 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/32a0013b-5a72-4319-9f2a-fe6672da61e3-credential-keys\") pod \"keystone-bootstrap-hhxfq\" (UID: \"32a0013b-5a72-4319-9f2a-fe6672da61e3\") " pod="keystone-kuttl-tests/keystone-bootstrap-hhxfq" Oct 07 13:26:20 crc kubenswrapper[4677]: I1007 13:26:20.130965 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32a0013b-5a72-4319-9f2a-fe6672da61e3-config-data\") pod \"keystone-bootstrap-hhxfq\" (UID: \"32a0013b-5a72-4319-9f2a-fe6672da61e3\") " pod="keystone-kuttl-tests/keystone-bootstrap-hhxfq" Oct 07 13:26:20 crc kubenswrapper[4677]: I1007 13:26:20.133753 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/32a0013b-5a72-4319-9f2a-fe6672da61e3-scripts\") pod \"keystone-bootstrap-hhxfq\" (UID: \"32a0013b-5a72-4319-9f2a-fe6672da61e3\") " pod="keystone-kuttl-tests/keystone-bootstrap-hhxfq" Oct 07 13:26:20 crc kubenswrapper[4677]: I1007 13:26:20.134106 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/32a0013b-5a72-4319-9f2a-fe6672da61e3-fernet-keys\") pod \"keystone-bootstrap-hhxfq\" (UID: \"32a0013b-5a72-4319-9f2a-fe6672da61e3\") " pod="keystone-kuttl-tests/keystone-bootstrap-hhxfq" Oct 07 13:26:20 crc kubenswrapper[4677]: I1007 13:26:20.146866 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s56k8\" (UniqueName: \"kubernetes.io/projected/32a0013b-5a72-4319-9f2a-fe6672da61e3-kube-api-access-s56k8\") pod \"keystone-bootstrap-hhxfq\" (UID: \"32a0013b-5a72-4319-9f2a-fe6672da61e3\") " pod="keystone-kuttl-tests/keystone-bootstrap-hhxfq" Oct 07 13:26:20 crc kubenswrapper[4677]: I1007 13:26:20.270736 4677 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-bootstrap-hhxfq" Oct 07 13:26:20 crc kubenswrapper[4677]: I1007 13:26:20.810028 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-bootstrap-hhxfq"] Oct 07 13:26:21 crc kubenswrapper[4677]: I1007 13:26:21.770212 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-bootstrap-hhxfq" event={"ID":"32a0013b-5a72-4319-9f2a-fe6672da61e3","Type":"ContainerStarted","Data":"711d70acf68909e2e4ff8981d2aa59365aefab24e76c2d9b01eee732ffc53e78"} Oct 07 13:26:21 crc kubenswrapper[4677]: I1007 13:26:21.770941 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-bootstrap-hhxfq" event={"ID":"32a0013b-5a72-4319-9f2a-fe6672da61e3","Type":"ContainerStarted","Data":"39e03e5cd9e2895e328d0e3ddd49907612e8c93b48c604664d3814a6ea838dd6"} Oct 07 13:26:23 crc kubenswrapper[4677]: I1007 13:26:23.803191 4677 generic.go:334] "Generic (PLEG): container finished" podID="32a0013b-5a72-4319-9f2a-fe6672da61e3" containerID="711d70acf68909e2e4ff8981d2aa59365aefab24e76c2d9b01eee732ffc53e78" exitCode=0 Oct 07 13:26:23 crc kubenswrapper[4677]: I1007 13:26:23.803279 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-bootstrap-hhxfq" event={"ID":"32a0013b-5a72-4319-9f2a-fe6672da61e3","Type":"ContainerDied","Data":"711d70acf68909e2e4ff8981d2aa59365aefab24e76c2d9b01eee732ffc53e78"} Oct 07 13:26:25 crc kubenswrapper[4677]: I1007 13:26:25.167548 4677 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-bootstrap-hhxfq" Oct 07 13:26:25 crc kubenswrapper[4677]: I1007 13:26:25.307497 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/32a0013b-5a72-4319-9f2a-fe6672da61e3-scripts\") pod \"32a0013b-5a72-4319-9f2a-fe6672da61e3\" (UID: \"32a0013b-5a72-4319-9f2a-fe6672da61e3\") " Oct 07 13:26:25 crc kubenswrapper[4677]: I1007 13:26:25.307754 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/32a0013b-5a72-4319-9f2a-fe6672da61e3-fernet-keys\") pod \"32a0013b-5a72-4319-9f2a-fe6672da61e3\" (UID: \"32a0013b-5a72-4319-9f2a-fe6672da61e3\") " Oct 07 13:26:25 crc kubenswrapper[4677]: I1007 13:26:25.307818 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32a0013b-5a72-4319-9f2a-fe6672da61e3-config-data\") pod \"32a0013b-5a72-4319-9f2a-fe6672da61e3\" (UID: \"32a0013b-5a72-4319-9f2a-fe6672da61e3\") " Oct 07 13:26:25 crc kubenswrapper[4677]: I1007 13:26:25.307859 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/32a0013b-5a72-4319-9f2a-fe6672da61e3-credential-keys\") pod \"32a0013b-5a72-4319-9f2a-fe6672da61e3\" (UID: \"32a0013b-5a72-4319-9f2a-fe6672da61e3\") " Oct 07 13:26:25 crc kubenswrapper[4677]: I1007 13:26:25.307933 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s56k8\" (UniqueName: \"kubernetes.io/projected/32a0013b-5a72-4319-9f2a-fe6672da61e3-kube-api-access-s56k8\") pod \"32a0013b-5a72-4319-9f2a-fe6672da61e3\" (UID: \"32a0013b-5a72-4319-9f2a-fe6672da61e3\") " Oct 07 13:26:25 crc kubenswrapper[4677]: I1007 13:26:25.321343 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/32a0013b-5a72-4319-9f2a-fe6672da61e3-kube-api-access-s56k8" (OuterVolumeSpecName: "kube-api-access-s56k8") pod "32a0013b-5a72-4319-9f2a-fe6672da61e3" (UID: "32a0013b-5a72-4319-9f2a-fe6672da61e3"). InnerVolumeSpecName "kube-api-access-s56k8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:26:25 crc kubenswrapper[4677]: I1007 13:26:25.321741 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32a0013b-5a72-4319-9f2a-fe6672da61e3-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "32a0013b-5a72-4319-9f2a-fe6672da61e3" (UID: "32a0013b-5a72-4319-9f2a-fe6672da61e3"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:26:25 crc kubenswrapper[4677]: I1007 13:26:25.324118 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32a0013b-5a72-4319-9f2a-fe6672da61e3-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "32a0013b-5a72-4319-9f2a-fe6672da61e3" (UID: "32a0013b-5a72-4319-9f2a-fe6672da61e3"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:26:25 crc kubenswrapper[4677]: I1007 13:26:25.324717 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32a0013b-5a72-4319-9f2a-fe6672da61e3-scripts" (OuterVolumeSpecName: "scripts") pod "32a0013b-5a72-4319-9f2a-fe6672da61e3" (UID: "32a0013b-5a72-4319-9f2a-fe6672da61e3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:26:25 crc kubenswrapper[4677]: I1007 13:26:25.343464 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32a0013b-5a72-4319-9f2a-fe6672da61e3-config-data" (OuterVolumeSpecName: "config-data") pod "32a0013b-5a72-4319-9f2a-fe6672da61e3" (UID: "32a0013b-5a72-4319-9f2a-fe6672da61e3"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:26:25 crc kubenswrapper[4677]: I1007 13:26:25.410153 4677 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32a0013b-5a72-4319-9f2a-fe6672da61e3-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 13:26:25 crc kubenswrapper[4677]: I1007 13:26:25.410460 4677 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/32a0013b-5a72-4319-9f2a-fe6672da61e3-credential-keys\") on node \"crc\" DevicePath \"\"" Oct 07 13:26:25 crc kubenswrapper[4677]: I1007 13:26:25.410658 4677 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s56k8\" (UniqueName: \"kubernetes.io/projected/32a0013b-5a72-4319-9f2a-fe6672da61e3-kube-api-access-s56k8\") on node \"crc\" DevicePath \"\"" Oct 07 13:26:25 crc kubenswrapper[4677]: I1007 13:26:25.410776 4677 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/32a0013b-5a72-4319-9f2a-fe6672da61e3-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 13:26:25 crc kubenswrapper[4677]: I1007 13:26:25.410884 4677 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/32a0013b-5a72-4319-9f2a-fe6672da61e3-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 07 13:26:25 crc kubenswrapper[4677]: I1007 13:26:25.822810 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-bootstrap-hhxfq" event={"ID":"32a0013b-5a72-4319-9f2a-fe6672da61e3","Type":"ContainerDied","Data":"39e03e5cd9e2895e328d0e3ddd49907612e8c93b48c604664d3814a6ea838dd6"} Oct 07 13:26:25 crc kubenswrapper[4677]: I1007 13:26:25.822871 4677 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="39e03e5cd9e2895e328d0e3ddd49907612e8c93b48c604664d3814a6ea838dd6" Oct 07 13:26:25 crc kubenswrapper[4677]: I1007 13:26:25.822943 4677 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-bootstrap-hhxfq" Oct 07 13:26:25 crc kubenswrapper[4677]: I1007 13:26:25.943041 4677 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone-678d98fc8-7qm2p"] Oct 07 13:26:25 crc kubenswrapper[4677]: E1007 13:26:25.943497 4677 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32a0013b-5a72-4319-9f2a-fe6672da61e3" containerName="keystone-bootstrap" Oct 07 13:26:25 crc kubenswrapper[4677]: I1007 13:26:25.943538 4677 state_mem.go:107] "Deleted CPUSet assignment" podUID="32a0013b-5a72-4319-9f2a-fe6672da61e3" containerName="keystone-bootstrap" Oct 07 13:26:25 crc kubenswrapper[4677]: I1007 13:26:25.943831 4677 memory_manager.go:354] "RemoveStaleState removing state" podUID="32a0013b-5a72-4319-9f2a-fe6672da61e3" containerName="keystone-bootstrap" Oct 07 13:26:25 crc kubenswrapper[4677]: I1007 13:26:25.944900 4677 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-678d98fc8-7qm2p" Oct 07 13:26:25 crc kubenswrapper[4677]: I1007 13:26:25.948800 4677 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone" Oct 07 13:26:25 crc kubenswrapper[4677]: I1007 13:26:25.948902 4677 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-keystone-dockercfg-r8pj8" Oct 07 13:26:25 crc kubenswrapper[4677]: I1007 13:26:25.949204 4677 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-config-data" Oct 07 13:26:25 crc kubenswrapper[4677]: I1007 13:26:25.949707 4677 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-scripts" Oct 07 13:26:25 crc kubenswrapper[4677]: I1007 13:26:25.960308 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-678d98fc8-7qm2p"] Oct 07 13:26:26 crc kubenswrapper[4677]: I1007 13:26:26.118913 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b345892-ea2d-4828-8981-c1a7a43c4cc5-config-data\") pod \"keystone-678d98fc8-7qm2p\" (UID: \"7b345892-ea2d-4828-8981-c1a7a43c4cc5\") " pod="keystone-kuttl-tests/keystone-678d98fc8-7qm2p" Oct 07 13:26:26 crc kubenswrapper[4677]: I1007 13:26:26.118983 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x9bxf\" (UniqueName: \"kubernetes.io/projected/7b345892-ea2d-4828-8981-c1a7a43c4cc5-kube-api-access-x9bxf\") pod \"keystone-678d98fc8-7qm2p\" (UID: \"7b345892-ea2d-4828-8981-c1a7a43c4cc5\") " pod="keystone-kuttl-tests/keystone-678d98fc8-7qm2p" Oct 07 13:26:26 crc kubenswrapper[4677]: I1007 13:26:26.119222 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7b345892-ea2d-4828-8981-c1a7a43c4cc5-scripts\") pod \"keystone-678d98fc8-7qm2p\" (UID: \"7b345892-ea2d-4828-8981-c1a7a43c4cc5\") " pod="keystone-kuttl-tests/keystone-678d98fc8-7qm2p" Oct 07 13:26:26 crc kubenswrapper[4677]: I1007 13:26:26.119290 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7b345892-ea2d-4828-8981-c1a7a43c4cc5-fernet-keys\") pod \"keystone-678d98fc8-7qm2p\" (UID: \"7b345892-ea2d-4828-8981-c1a7a43c4cc5\") " pod="keystone-kuttl-tests/keystone-678d98fc8-7qm2p" Oct 07 13:26:26 crc kubenswrapper[4677]: I1007 13:26:26.119521 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/7b345892-ea2d-4828-8981-c1a7a43c4cc5-credential-keys\") pod \"keystone-678d98fc8-7qm2p\" (UID: \"7b345892-ea2d-4828-8981-c1a7a43c4cc5\") " pod="keystone-kuttl-tests/keystone-678d98fc8-7qm2p" Oct 07 13:26:26 crc kubenswrapper[4677]: I1007 13:26:26.221077 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7b345892-ea2d-4828-8981-c1a7a43c4cc5-scripts\") pod \"keystone-678d98fc8-7qm2p\" (UID: \"7b345892-ea2d-4828-8981-c1a7a43c4cc5\") " pod="keystone-kuttl-tests/keystone-678d98fc8-7qm2p" Oct 07 13:26:26 crc kubenswrapper[4677]: I1007 13:26:26.222185 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: 
\"kubernetes.io/secret/7b345892-ea2d-4828-8981-c1a7a43c4cc5-fernet-keys\") pod \"keystone-678d98fc8-7qm2p\" (UID: \"7b345892-ea2d-4828-8981-c1a7a43c4cc5\") " pod="keystone-kuttl-tests/keystone-678d98fc8-7qm2p" Oct 07 13:26:26 crc kubenswrapper[4677]: I1007 13:26:26.222362 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/7b345892-ea2d-4828-8981-c1a7a43c4cc5-credential-keys\") pod \"keystone-678d98fc8-7qm2p\" (UID: \"7b345892-ea2d-4828-8981-c1a7a43c4cc5\") " pod="keystone-kuttl-tests/keystone-678d98fc8-7qm2p" Oct 07 13:26:26 crc kubenswrapper[4677]: I1007 13:26:26.222526 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b345892-ea2d-4828-8981-c1a7a43c4cc5-config-data\") pod \"keystone-678d98fc8-7qm2p\" (UID: \"7b345892-ea2d-4828-8981-c1a7a43c4cc5\") " pod="keystone-kuttl-tests/keystone-678d98fc8-7qm2p" Oct 07 13:26:26 crc kubenswrapper[4677]: I1007 13:26:26.222687 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x9bxf\" (UniqueName: \"kubernetes.io/projected/7b345892-ea2d-4828-8981-c1a7a43c4cc5-kube-api-access-x9bxf\") pod \"keystone-678d98fc8-7qm2p\" (UID: \"7b345892-ea2d-4828-8981-c1a7a43c4cc5\") " pod="keystone-kuttl-tests/keystone-678d98fc8-7qm2p" Oct 07 13:26:26 crc kubenswrapper[4677]: I1007 13:26:26.225824 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7b345892-ea2d-4828-8981-c1a7a43c4cc5-scripts\") pod \"keystone-678d98fc8-7qm2p\" (UID: \"7b345892-ea2d-4828-8981-c1a7a43c4cc5\") " pod="keystone-kuttl-tests/keystone-678d98fc8-7qm2p" Oct 07 13:26:26 crc kubenswrapper[4677]: I1007 13:26:26.226598 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b345892-ea2d-4828-8981-c1a7a43c4cc5-config-data\") pod \"keystone-678d98fc8-7qm2p\" (UID: \"7b345892-ea2d-4828-8981-c1a7a43c4cc5\") " pod="keystone-kuttl-tests/keystone-678d98fc8-7qm2p" Oct 07 13:26:26 crc kubenswrapper[4677]: I1007 13:26:26.227020 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/7b345892-ea2d-4828-8981-c1a7a43c4cc5-credential-keys\") pod \"keystone-678d98fc8-7qm2p\" (UID: \"7b345892-ea2d-4828-8981-c1a7a43c4cc5\") " pod="keystone-kuttl-tests/keystone-678d98fc8-7qm2p" Oct 07 13:26:26 crc kubenswrapper[4677]: I1007 13:26:26.229923 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7b345892-ea2d-4828-8981-c1a7a43c4cc5-fernet-keys\") pod \"keystone-678d98fc8-7qm2p\" (UID: \"7b345892-ea2d-4828-8981-c1a7a43c4cc5\") " pod="keystone-kuttl-tests/keystone-678d98fc8-7qm2p" Oct 07 13:26:26 crc kubenswrapper[4677]: I1007 13:26:26.243400 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x9bxf\" (UniqueName: \"kubernetes.io/projected/7b345892-ea2d-4828-8981-c1a7a43c4cc5-kube-api-access-x9bxf\") pod \"keystone-678d98fc8-7qm2p\" (UID: \"7b345892-ea2d-4828-8981-c1a7a43c4cc5\") " pod="keystone-kuttl-tests/keystone-678d98fc8-7qm2p" Oct 07 13:26:26 crc kubenswrapper[4677]: I1007 13:26:26.283666 4677 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-678d98fc8-7qm2p" Oct 07 13:26:26 crc kubenswrapper[4677]: I1007 13:26:26.494155 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-678d98fc8-7qm2p"] Oct 07 13:26:26 crc kubenswrapper[4677]: I1007 13:26:26.833245 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-678d98fc8-7qm2p" event={"ID":"7b345892-ea2d-4828-8981-c1a7a43c4cc5","Type":"ContainerStarted","Data":"a9b73d17b1ba6946d7f59c95533fadb42add437047dd154c4e9985b7b6a2a934"} Oct 07 13:26:27 crc kubenswrapper[4677]: I1007 13:26:27.846142 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-678d98fc8-7qm2p" event={"ID":"7b345892-ea2d-4828-8981-c1a7a43c4cc5","Type":"ContainerStarted","Data":"8faf93a6b7fbd929de78607df8313b8cb2b3855ed30dc1b789cf0d9150cb15d6"} Oct 07 13:26:27 crc kubenswrapper[4677]: I1007 13:26:27.846338 4677 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="keystone-kuttl-tests/keystone-678d98fc8-7qm2p" Oct 07 13:26:27 crc kubenswrapper[4677]: I1007 13:26:27.871221 4677 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keystone-kuttl-tests/keystone-678d98fc8-7qm2p" podStartSLOduration=2.871198636 podStartE2EDuration="2.871198636s" podCreationTimestamp="2025-10-07 13:26:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:26:27.868280572 +0000 UTC m=+1159.353989727" watchObservedRunningTime="2025-10-07 13:26:27.871198636 +0000 UTC m=+1159.356907791" Oct 07 13:26:57 crc kubenswrapper[4677]: I1007 13:26:57.608989 4677 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="keystone-kuttl-tests/keystone-678d98fc8-7qm2p" Oct 07 13:27:10 crc kubenswrapper[4677]: I1007 13:27:10.917211 4677 patch_prober.go:28] interesting pod/machine-config-daemon-r7cnz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 13:27:10 crc kubenswrapper[4677]: I1007 13:27:10.919756 4677 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-r7cnz" podUID="7879fa59-a7cb-4d29-ba3a-c91f43bfcba6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 13:27:16 crc kubenswrapper[4677]: I1007 13:27:16.358456 4677 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-db-sync-4rbcx"] Oct 07 13:27:16 crc kubenswrapper[4677]: I1007 13:27:16.377406 4677 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-bootstrap-hhxfq"] Oct 07 13:27:16 crc kubenswrapper[4677]: I1007 13:27:16.381849 4677 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone-db-sync-4rbcx"] Oct 07 13:27:16 crc kubenswrapper[4677]: I1007 13:27:16.386483 4677 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone-bootstrap-hhxfq"] Oct 07 13:27:16 crc kubenswrapper[4677]: I1007 13:27:16.390211 4677 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-678d98fc8-7qm2p"] Oct 07 13:27:16 crc kubenswrapper[4677]: I1007 13:27:16.390486 4677 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="keystone-kuttl-tests/keystone-678d98fc8-7qm2p" podUID="7b345892-ea2d-4828-8981-c1a7a43c4cc5" containerName="keystone-api" containerID="cri-o://8faf93a6b7fbd929de78607df8313b8cb2b3855ed30dc1b789cf0d9150cb15d6" gracePeriod=30 Oct 07 13:27:16 crc kubenswrapper[4677]: I1007 13:27:16.397051 4677 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystonefde5-account-delete-ctnq8"] Oct 07 13:27:16 crc kubenswrapper[4677]: I1007 13:27:16.397942 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystonefde5-account-delete-ctnq8" Oct 07 13:27:16 crc kubenswrapper[4677]: I1007 13:27:16.410351 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystonefde5-account-delete-ctnq8"] Oct 07 13:27:16 crc kubenswrapper[4677]: I1007 13:27:16.448972 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qc9jx\" (UniqueName: \"kubernetes.io/projected/5873850b-1d72-48cb-b2d7-7e065541b888-kube-api-access-qc9jx\") pod \"keystonefde5-account-delete-ctnq8\" (UID: \"5873850b-1d72-48cb-b2d7-7e065541b888\") " pod="keystone-kuttl-tests/keystonefde5-account-delete-ctnq8" Oct 07 13:27:16 crc kubenswrapper[4677]: I1007 13:27:16.549828 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qc9jx\" (UniqueName: \"kubernetes.io/projected/5873850b-1d72-48cb-b2d7-7e065541b888-kube-api-access-qc9jx\") pod \"keystonefde5-account-delete-ctnq8\" (UID: \"5873850b-1d72-48cb-b2d7-7e065541b888\") " pod="keystone-kuttl-tests/keystonefde5-account-delete-ctnq8" Oct 07 13:27:16 crc kubenswrapper[4677]: I1007 13:27:16.570837 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qc9jx\" (UniqueName: \"kubernetes.io/projected/5873850b-1d72-48cb-b2d7-7e065541b888-kube-api-access-qc9jx\") pod \"keystonefde5-account-delete-ctnq8\" (UID: \"5873850b-1d72-48cb-b2d7-7e065541b888\") " pod="keystone-kuttl-tests/keystonefde5-account-delete-ctnq8" Oct 07 13:27:16 crc kubenswrapper[4677]: I1007 13:27:16.715048 4677 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystonefde5-account-delete-ctnq8" Oct 07 13:27:17 crc kubenswrapper[4677]: I1007 13:27:17.224654 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystonefde5-account-delete-ctnq8"] Oct 07 13:27:17 crc kubenswrapper[4677]: I1007 13:27:17.316522 4677 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="32a0013b-5a72-4319-9f2a-fe6672da61e3" path="/var/lib/kubelet/pods/32a0013b-5a72-4319-9f2a-fe6672da61e3/volumes" Oct 07 13:27:17 crc kubenswrapper[4677]: I1007 13:27:17.317859 4677 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f009e54b-31c5-46c8-921e-38ce4c2efc7d" path="/var/lib/kubelet/pods/f009e54b-31c5-46c8-921e-38ce4c2efc7d/volumes" Oct 07 13:27:18 crc kubenswrapper[4677]: I1007 13:27:18.248066 4677 generic.go:334] "Generic (PLEG): container finished" podID="5873850b-1d72-48cb-b2d7-7e065541b888" containerID="381198013d79ab3c7228dac947aa677060b49de0637755b01d97c879f2afc651" exitCode=0 Oct 07 13:27:18 crc kubenswrapper[4677]: I1007 13:27:18.248190 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystonefde5-account-delete-ctnq8" event={"ID":"5873850b-1d72-48cb-b2d7-7e065541b888","Type":"ContainerDied","Data":"381198013d79ab3c7228dac947aa677060b49de0637755b01d97c879f2afc651"} Oct 07 13:27:18 crc kubenswrapper[4677]: I1007 13:27:18.248452 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystonefde5-account-delete-ctnq8" event={"ID":"5873850b-1d72-48cb-b2d7-7e065541b888","Type":"ContainerStarted","Data":"fd59f63de6354f2b8ba19134817a62fa568158b687bb2a163343943637bbd17e"} Oct 07 13:27:19 crc kubenswrapper[4677]: I1007 13:27:19.539767 4677 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystonefde5-account-delete-ctnq8" Oct 07 13:27:19 crc kubenswrapper[4677]: I1007 13:27:19.693651 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qc9jx\" (UniqueName: \"kubernetes.io/projected/5873850b-1d72-48cb-b2d7-7e065541b888-kube-api-access-qc9jx\") pod \"5873850b-1d72-48cb-b2d7-7e065541b888\" (UID: \"5873850b-1d72-48cb-b2d7-7e065541b888\") " Oct 07 13:27:19 crc kubenswrapper[4677]: I1007 13:27:19.699115 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5873850b-1d72-48cb-b2d7-7e065541b888-kube-api-access-qc9jx" (OuterVolumeSpecName: "kube-api-access-qc9jx") pod "5873850b-1d72-48cb-b2d7-7e065541b888" (UID: "5873850b-1d72-48cb-b2d7-7e065541b888"). InnerVolumeSpecName "kube-api-access-qc9jx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:27:19 crc kubenswrapper[4677]: I1007 13:27:19.795722 4677 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qc9jx\" (UniqueName: \"kubernetes.io/projected/5873850b-1d72-48cb-b2d7-7e065541b888-kube-api-access-qc9jx\") on node \"crc\" DevicePath \"\"" Oct 07 13:27:20 crc kubenswrapper[4677]: I1007 13:27:20.266984 4677 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystonefde5-account-delete-ctnq8" Oct 07 13:27:20 crc kubenswrapper[4677]: I1007 13:27:20.266978 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystonefde5-account-delete-ctnq8" event={"ID":"5873850b-1d72-48cb-b2d7-7e065541b888","Type":"ContainerDied","Data":"fd59f63de6354f2b8ba19134817a62fa568158b687bb2a163343943637bbd17e"} Oct 07 13:27:20 crc kubenswrapper[4677]: I1007 13:27:20.267187 4677 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fd59f63de6354f2b8ba19134817a62fa568158b687bb2a163343943637bbd17e" Oct 07 13:27:20 crc kubenswrapper[4677]: I1007 13:27:20.270712 4677 generic.go:334] "Generic (PLEG): container finished" podID="7b345892-ea2d-4828-8981-c1a7a43c4cc5" containerID="8faf93a6b7fbd929de78607df8313b8cb2b3855ed30dc1b789cf0d9150cb15d6" exitCode=0 Oct 07 13:27:20 crc kubenswrapper[4677]: I1007 13:27:20.270755 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-678d98fc8-7qm2p" event={"ID":"7b345892-ea2d-4828-8981-c1a7a43c4cc5","Type":"ContainerDied","Data":"8faf93a6b7fbd929de78607df8313b8cb2b3855ed30dc1b789cf0d9150cb15d6"} Oct 07 13:27:20 crc kubenswrapper[4677]: I1007 13:27:20.578912 4677 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-678d98fc8-7qm2p" Oct 07 13:27:20 crc kubenswrapper[4677]: I1007 13:27:20.708758 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/7b345892-ea2d-4828-8981-c1a7a43c4cc5-credential-keys\") pod \"7b345892-ea2d-4828-8981-c1a7a43c4cc5\" (UID: \"7b345892-ea2d-4828-8981-c1a7a43c4cc5\") " Oct 07 13:27:20 crc kubenswrapper[4677]: I1007 13:27:20.708851 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7b345892-ea2d-4828-8981-c1a7a43c4cc5-fernet-keys\") pod \"7b345892-ea2d-4828-8981-c1a7a43c4cc5\" (UID: \"7b345892-ea2d-4828-8981-c1a7a43c4cc5\") " Oct 07 13:27:20 crc kubenswrapper[4677]: I1007 13:27:20.708918 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7b345892-ea2d-4828-8981-c1a7a43c4cc5-scripts\") pod \"7b345892-ea2d-4828-8981-c1a7a43c4cc5\" (UID: \"7b345892-ea2d-4828-8981-c1a7a43c4cc5\") " Oct 07 13:27:20 crc kubenswrapper[4677]: I1007 13:27:20.708949 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x9bxf\" (UniqueName: \"kubernetes.io/projected/7b345892-ea2d-4828-8981-c1a7a43c4cc5-kube-api-access-x9bxf\") pod \"7b345892-ea2d-4828-8981-c1a7a43c4cc5\" (UID: \"7b345892-ea2d-4828-8981-c1a7a43c4cc5\") " Oct 07 13:27:20 crc kubenswrapper[4677]: I1007 13:27:20.709018 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b345892-ea2d-4828-8981-c1a7a43c4cc5-config-data\") pod \"7b345892-ea2d-4828-8981-c1a7a43c4cc5\" (UID: \"7b345892-ea2d-4828-8981-c1a7a43c4cc5\") " Oct 07 13:27:20 crc kubenswrapper[4677]: I1007 13:27:20.712573 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b345892-ea2d-4828-8981-c1a7a43c4cc5-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "7b345892-ea2d-4828-8981-c1a7a43c4cc5" (UID: "7b345892-ea2d-4828-8981-c1a7a43c4cc5"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:27:20 crc kubenswrapper[4677]: I1007 13:27:20.712605 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b345892-ea2d-4828-8981-c1a7a43c4cc5-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "7b345892-ea2d-4828-8981-c1a7a43c4cc5" (UID: "7b345892-ea2d-4828-8981-c1a7a43c4cc5"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:27:20 crc kubenswrapper[4677]: I1007 13:27:20.714071 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b345892-ea2d-4828-8981-c1a7a43c4cc5-kube-api-access-x9bxf" (OuterVolumeSpecName: "kube-api-access-x9bxf") pod "7b345892-ea2d-4828-8981-c1a7a43c4cc5" (UID: "7b345892-ea2d-4828-8981-c1a7a43c4cc5"). InnerVolumeSpecName "kube-api-access-x9bxf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:27:20 crc kubenswrapper[4677]: I1007 13:27:20.714172 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b345892-ea2d-4828-8981-c1a7a43c4cc5-scripts" (OuterVolumeSpecName: "scripts") pod "7b345892-ea2d-4828-8981-c1a7a43c4cc5" (UID: "7b345892-ea2d-4828-8981-c1a7a43c4cc5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:27:20 crc kubenswrapper[4677]: I1007 13:27:20.730636 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b345892-ea2d-4828-8981-c1a7a43c4cc5-config-data" (OuterVolumeSpecName: "config-data") pod "7b345892-ea2d-4828-8981-c1a7a43c4cc5" (UID: "7b345892-ea2d-4828-8981-c1a7a43c4cc5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:27:20 crc kubenswrapper[4677]: I1007 13:27:20.810778 4677 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/7b345892-ea2d-4828-8981-c1a7a43c4cc5-credential-keys\") on node \"crc\" DevicePath \"\"" Oct 07 13:27:20 crc kubenswrapper[4677]: I1007 13:27:20.810810 4677 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7b345892-ea2d-4828-8981-c1a7a43c4cc5-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 07 13:27:20 crc kubenswrapper[4677]: I1007 13:27:20.810821 4677 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7b345892-ea2d-4828-8981-c1a7a43c4cc5-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 13:27:20 crc kubenswrapper[4677]: I1007 13:27:20.810832 4677 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x9bxf\" (UniqueName: \"kubernetes.io/projected/7b345892-ea2d-4828-8981-c1a7a43c4cc5-kube-api-access-x9bxf\") on node \"crc\" DevicePath \"\"" Oct 07 13:27:20 crc kubenswrapper[4677]: I1007 13:27:20.810841 4677 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b345892-ea2d-4828-8981-c1a7a43c4cc5-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 13:27:21 crc kubenswrapper[4677]: I1007 13:27:21.299131 4677 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-678d98fc8-7qm2p" Oct 07 13:27:21 crc kubenswrapper[4677]: I1007 13:27:21.299112 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-678d98fc8-7qm2p" event={"ID":"7b345892-ea2d-4828-8981-c1a7a43c4cc5","Type":"ContainerDied","Data":"a9b73d17b1ba6946d7f59c95533fadb42add437047dd154c4e9985b7b6a2a934"} Oct 07 13:27:21 crc kubenswrapper[4677]: I1007 13:27:21.299550 4677 scope.go:117] "RemoveContainer" containerID="8faf93a6b7fbd929de78607df8313b8cb2b3855ed30dc1b789cf0d9150cb15d6" Oct 07 13:27:21 crc kubenswrapper[4677]: I1007 13:27:21.349719 4677 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-678d98fc8-7qm2p"] Oct 07 13:27:21 crc kubenswrapper[4677]: I1007 13:27:21.356298 4677 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone-678d98fc8-7qm2p"] Oct 07 13:27:21 crc kubenswrapper[4677]: I1007 13:27:21.397791 4677 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-db-create-z4tf5"] Oct 07 13:27:21 crc kubenswrapper[4677]: I1007 13:27:21.403040 4677 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone-db-create-z4tf5"] Oct 07 13:27:21 crc kubenswrapper[4677]: I1007 13:27:21.408827 4677 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-fde5-account-create-qxdpj"] Oct 07 13:27:21 crc kubenswrapper[4677]: I1007 13:27:21.414637 4677 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone-fde5-account-create-qxdpj"] Oct 07 13:27:21 crc kubenswrapper[4677]: I1007 13:27:21.418666 4677 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystonefde5-account-delete-ctnq8"] Oct 07 13:27:21 crc kubenswrapper[4677]: I1007 13:27:21.422990 4677 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystonefde5-account-delete-ctnq8"] Oct 07 13:27:21 crc kubenswrapper[4677]: I1007 13:27:21.572148 4677 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone-db-create-zhhhm"] Oct 07 13:27:21 crc kubenswrapper[4677]: E1007 13:27:21.572718 4677 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b345892-ea2d-4828-8981-c1a7a43c4cc5" containerName="keystone-api" Oct 07 13:27:21 crc kubenswrapper[4677]: I1007 13:27:21.572743 4677 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b345892-ea2d-4828-8981-c1a7a43c4cc5" containerName="keystone-api" Oct 07 13:27:21 crc kubenswrapper[4677]: E1007 13:27:21.572758 4677 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5873850b-1d72-48cb-b2d7-7e065541b888" containerName="mariadb-account-delete" Oct 07 13:27:21 crc kubenswrapper[4677]: I1007 13:27:21.572767 4677 state_mem.go:107] "Deleted CPUSet assignment" podUID="5873850b-1d72-48cb-b2d7-7e065541b888" containerName="mariadb-account-delete" Oct 07 13:27:21 crc kubenswrapper[4677]: I1007 13:27:21.572909 4677 memory_manager.go:354] "RemoveStaleState removing state" podUID="5873850b-1d72-48cb-b2d7-7e065541b888" containerName="mariadb-account-delete" Oct 07 13:27:21 crc kubenswrapper[4677]: I1007 13:27:21.572930 4677 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b345892-ea2d-4828-8981-c1a7a43c4cc5" containerName="keystone-api" Oct 07 13:27:21 crc kubenswrapper[4677]: I1007 13:27:21.573458 4677 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-db-create-zhhhm" Oct 07 13:27:21 crc kubenswrapper[4677]: I1007 13:27:21.583284 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-db-create-zhhhm"] Oct 07 13:27:21 crc kubenswrapper[4677]: I1007 13:27:21.723032 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zv7jq\" (UniqueName: \"kubernetes.io/projected/1311532b-7d0f-44bd-9e3b-4f910a018031-kube-api-access-zv7jq\") pod \"keystone-db-create-zhhhm\" (UID: \"1311532b-7d0f-44bd-9e3b-4f910a018031\") " pod="keystone-kuttl-tests/keystone-db-create-zhhhm" Oct 07 13:27:21 crc kubenswrapper[4677]: I1007 13:27:21.824844 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zv7jq\" (UniqueName: \"kubernetes.io/projected/1311532b-7d0f-44bd-9e3b-4f910a018031-kube-api-access-zv7jq\") pod \"keystone-db-create-zhhhm\" (UID: \"1311532b-7d0f-44bd-9e3b-4f910a018031\") " pod="keystone-kuttl-tests/keystone-db-create-zhhhm" Oct 07 13:27:21 crc kubenswrapper[4677]: I1007 13:27:21.852707 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zv7jq\" (UniqueName: \"kubernetes.io/projected/1311532b-7d0f-44bd-9e3b-4f910a018031-kube-api-access-zv7jq\") pod \"keystone-db-create-zhhhm\" (UID: \"1311532b-7d0f-44bd-9e3b-4f910a018031\") " pod="keystone-kuttl-tests/keystone-db-create-zhhhm" Oct 07 13:27:21 crc kubenswrapper[4677]: I1007 13:27:21.900074 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-db-create-zhhhm" Oct 07 13:27:22 crc kubenswrapper[4677]: I1007 13:27:22.374941 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-db-create-zhhhm"] Oct 07 13:27:22 crc kubenswrapper[4677]: W1007 13:27:22.381725 4677 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1311532b_7d0f_44bd_9e3b_4f910a018031.slice/crio-5be990ab955dcfb058bc5c207656f302f2742d550886a86e53c3d514f5dbf236 WatchSource:0}: Error finding container 5be990ab955dcfb058bc5c207656f302f2742d550886a86e53c3d514f5dbf236: Status 404 returned error can't find the container with id 5be990ab955dcfb058bc5c207656f302f2742d550886a86e53c3d514f5dbf236 Oct 07 13:27:23 crc kubenswrapper[4677]: I1007 13:27:23.319073 4677 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="10d26766-ede1-405c-8838-2e9273105a22" path="/var/lib/kubelet/pods/10d26766-ede1-405c-8838-2e9273105a22/volumes" Oct 07 13:27:23 crc kubenswrapper[4677]: I1007 13:27:23.320950 4677 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5873850b-1d72-48cb-b2d7-7e065541b888" path="/var/lib/kubelet/pods/5873850b-1d72-48cb-b2d7-7e065541b888/volumes" Oct 07 13:27:23 crc kubenswrapper[4677]: I1007 13:27:23.321859 4677 generic.go:334] "Generic (PLEG): container finished" podID="1311532b-7d0f-44bd-9e3b-4f910a018031" containerID="17387912a8ae0d2d62857729b05ff38794fa8183757efc09b0f6e842eef7fff3" exitCode=0 Oct 07 13:27:23 crc kubenswrapper[4677]: I1007 13:27:23.322032 4677 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7b345892-ea2d-4828-8981-c1a7a43c4cc5" path="/var/lib/kubelet/pods/7b345892-ea2d-4828-8981-c1a7a43c4cc5/volumes" Oct 07 13:27:23 crc kubenswrapper[4677]: I1007 13:27:23.322930 4677 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="efe6d5a4-3ea9-48da-8a63-ec7eca7e3fa1" path="/var/lib/kubelet/pods/efe6d5a4-3ea9-48da-8a63-ec7eca7e3fa1/volumes" Oct 07 13:27:23 crc kubenswrapper[4677]: I1007 13:27:23.324554 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-create-zhhhm" event={"ID":"1311532b-7d0f-44bd-9e3b-4f910a018031","Type":"ContainerDied","Data":"17387912a8ae0d2d62857729b05ff38794fa8183757efc09b0f6e842eef7fff3"} Oct 07 13:27:23 crc kubenswrapper[4677]: I1007 13:27:23.324594 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-create-zhhhm" event={"ID":"1311532b-7d0f-44bd-9e3b-4f910a018031","Type":"ContainerStarted","Data":"5be990ab955dcfb058bc5c207656f302f2742d550886a86e53c3d514f5dbf236"} Oct 07 13:27:24 crc kubenswrapper[4677]: I1007 13:27:24.624208 4677 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-db-create-zhhhm" Oct 07 13:27:24 crc kubenswrapper[4677]: I1007 13:27:24.765995 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zv7jq\" (UniqueName: \"kubernetes.io/projected/1311532b-7d0f-44bd-9e3b-4f910a018031-kube-api-access-zv7jq\") pod \"1311532b-7d0f-44bd-9e3b-4f910a018031\" (UID: \"1311532b-7d0f-44bd-9e3b-4f910a018031\") " Oct 07 13:27:24 crc kubenswrapper[4677]: I1007 13:27:24.773675 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1311532b-7d0f-44bd-9e3b-4f910a018031-kube-api-access-zv7jq" (OuterVolumeSpecName: "kube-api-access-zv7jq") pod "1311532b-7d0f-44bd-9e3b-4f910a018031" (UID: "1311532b-7d0f-44bd-9e3b-4f910a018031"). InnerVolumeSpecName "kube-api-access-zv7jq". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:27:24 crc kubenswrapper[4677]: I1007 13:27:24.868604 4677 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zv7jq\" (UniqueName: \"kubernetes.io/projected/1311532b-7d0f-44bd-9e3b-4f910a018031-kube-api-access-zv7jq\") on node \"crc\" DevicePath \"\"" Oct 07 13:27:25 crc kubenswrapper[4677]: I1007 13:27:25.335964 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-create-zhhhm" event={"ID":"1311532b-7d0f-44bd-9e3b-4f910a018031","Type":"ContainerDied","Data":"5be990ab955dcfb058bc5c207656f302f2742d550886a86e53c3d514f5dbf236"} Oct 07 13:27:25 crc kubenswrapper[4677]: I1007 13:27:25.336016 4677 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5be990ab955dcfb058bc5c207656f302f2742d550886a86e53c3d514f5dbf236" Oct 07 13:27:25 crc kubenswrapper[4677]: I1007 13:27:25.336044 4677 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-db-create-zhhhm" Oct 07 13:27:31 crc kubenswrapper[4677]: I1007 13:27:31.716514 4677 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone-9e17-account-create-bpbk4"] Oct 07 13:27:31 crc kubenswrapper[4677]: E1007 13:27:31.717322 4677 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1311532b-7d0f-44bd-9e3b-4f910a018031" containerName="mariadb-database-create" Oct 07 13:27:31 crc kubenswrapper[4677]: I1007 13:27:31.717337 4677 state_mem.go:107] "Deleted CPUSet assignment" podUID="1311532b-7d0f-44bd-9e3b-4f910a018031" containerName="mariadb-database-create" Oct 07 13:27:31 crc kubenswrapper[4677]: I1007 13:27:31.717509 4677 memory_manager.go:354] "RemoveStaleState removing state" podUID="1311532b-7d0f-44bd-9e3b-4f910a018031" containerName="mariadb-database-create" Oct 07 13:27:31 crc kubenswrapper[4677]: I1007 13:27:31.718007 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-9e17-account-create-bpbk4" Oct 07 13:27:31 crc kubenswrapper[4677]: I1007 13:27:31.720238 4677 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-db-secret" Oct 07 13:27:31 crc kubenswrapper[4677]: I1007 13:27:31.748185 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-9e17-account-create-bpbk4"] Oct 07 13:27:31 crc kubenswrapper[4677]: I1007 13:27:31.885906 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zv5g8\" (UniqueName: \"kubernetes.io/projected/0dd04856-1b05-4718-965c-95a0e396c5d6-kube-api-access-zv5g8\") pod \"keystone-9e17-account-create-bpbk4\" (UID: \"0dd04856-1b05-4718-965c-95a0e396c5d6\") " pod="keystone-kuttl-tests/keystone-9e17-account-create-bpbk4" Oct 07 13:27:31 crc kubenswrapper[4677]: I1007 13:27:31.987998 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zv5g8\" (UniqueName: \"kubernetes.io/projected/0dd04856-1b05-4718-965c-95a0e396c5d6-kube-api-access-zv5g8\") pod \"keystone-9e17-account-create-bpbk4\" (UID: \"0dd04856-1b05-4718-965c-95a0e396c5d6\") " pod="keystone-kuttl-tests/keystone-9e17-account-create-bpbk4" Oct 07 13:27:32 crc kubenswrapper[4677]: I1007 13:27:32.011480 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zv5g8\" (UniqueName: \"kubernetes.io/projected/0dd04856-1b05-4718-965c-95a0e396c5d6-kube-api-access-zv5g8\") pod \"keystone-9e17-account-create-bpbk4\" (UID: \"0dd04856-1b05-4718-965c-95a0e396c5d6\") " pod="keystone-kuttl-tests/keystone-9e17-account-create-bpbk4" Oct 07 13:27:32 crc kubenswrapper[4677]: I1007 13:27:32.053388 4677 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-9e17-account-create-bpbk4" Oct 07 13:27:32 crc kubenswrapper[4677]: I1007 13:27:32.520219 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-9e17-account-create-bpbk4"] Oct 07 13:27:33 crc kubenswrapper[4677]: I1007 13:27:33.413646 4677 generic.go:334] "Generic (PLEG): container finished" podID="0dd04856-1b05-4718-965c-95a0e396c5d6" containerID="76febbdf1ccc81a3e62a38de278b17635aaf21f4031be6093e9400238f2a2c78" exitCode=0 Oct 07 13:27:33 crc kubenswrapper[4677]: I1007 13:27:33.413938 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-9e17-account-create-bpbk4" event={"ID":"0dd04856-1b05-4718-965c-95a0e396c5d6","Type":"ContainerDied","Data":"76febbdf1ccc81a3e62a38de278b17635aaf21f4031be6093e9400238f2a2c78"} Oct 07 13:27:33 crc kubenswrapper[4677]: I1007 13:27:33.413965 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-9e17-account-create-bpbk4" event={"ID":"0dd04856-1b05-4718-965c-95a0e396c5d6","Type":"ContainerStarted","Data":"b08f34a3d86778cae6d4c8fb6ce4d0df0dfccb3311c27bf577390bcbb14c4035"} Oct 07 13:27:34 crc kubenswrapper[4677]: I1007 13:27:34.652279 4677 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-9e17-account-create-bpbk4" Oct 07 13:27:34 crc kubenswrapper[4677]: I1007 13:27:34.831196 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zv5g8\" (UniqueName: \"kubernetes.io/projected/0dd04856-1b05-4718-965c-95a0e396c5d6-kube-api-access-zv5g8\") pod \"0dd04856-1b05-4718-965c-95a0e396c5d6\" (UID: \"0dd04856-1b05-4718-965c-95a0e396c5d6\") " Oct 07 13:27:34 crc kubenswrapper[4677]: I1007 13:27:34.840292 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0dd04856-1b05-4718-965c-95a0e396c5d6-kube-api-access-zv5g8" (OuterVolumeSpecName: "kube-api-access-zv5g8") pod "0dd04856-1b05-4718-965c-95a0e396c5d6" (UID: "0dd04856-1b05-4718-965c-95a0e396c5d6"). InnerVolumeSpecName "kube-api-access-zv5g8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:27:34 crc kubenswrapper[4677]: I1007 13:27:34.933479 4677 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zv5g8\" (UniqueName: \"kubernetes.io/projected/0dd04856-1b05-4718-965c-95a0e396c5d6-kube-api-access-zv5g8\") on node \"crc\" DevicePath \"\"" Oct 07 13:27:35 crc kubenswrapper[4677]: I1007 13:27:35.427772 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-9e17-account-create-bpbk4" event={"ID":"0dd04856-1b05-4718-965c-95a0e396c5d6","Type":"ContainerDied","Data":"b08f34a3d86778cae6d4c8fb6ce4d0df0dfccb3311c27bf577390bcbb14c4035"} Oct 07 13:27:35 crc kubenswrapper[4677]: I1007 13:27:35.427807 4677 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b08f34a3d86778cae6d4c8fb6ce4d0df0dfccb3311c27bf577390bcbb14c4035" Oct 07 13:27:35 crc kubenswrapper[4677]: I1007 13:27:35.427822 4677 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-9e17-account-create-bpbk4" Oct 07 13:27:37 crc kubenswrapper[4677]: I1007 13:27:37.282456 4677 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone-db-sync-76wjh"] Oct 07 13:27:37 crc kubenswrapper[4677]: E1007 13:27:37.282956 4677 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0dd04856-1b05-4718-965c-95a0e396c5d6" containerName="mariadb-account-create" Oct 07 13:27:37 crc kubenswrapper[4677]: I1007 13:27:37.282971 4677 state_mem.go:107] "Deleted CPUSet assignment" podUID="0dd04856-1b05-4718-965c-95a0e396c5d6" containerName="mariadb-account-create" Oct 07 13:27:37 crc kubenswrapper[4677]: I1007 13:27:37.283139 4677 memory_manager.go:354] "RemoveStaleState removing state" podUID="0dd04856-1b05-4718-965c-95a0e396c5d6" containerName="mariadb-account-create" Oct 07 13:27:37 crc kubenswrapper[4677]: I1007 13:27:37.283634 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-db-sync-76wjh" Oct 07 13:27:37 crc kubenswrapper[4677]: I1007 13:27:37.286382 4677 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-keystone-dockercfg-f6wgp" Oct 07 13:27:37 crc kubenswrapper[4677]: I1007 13:27:37.291955 4677 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-config-data" Oct 07 13:27:37 crc kubenswrapper[4677]: I1007 13:27:37.292120 4677 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-scripts" Oct 07 13:27:37 crc kubenswrapper[4677]: I1007 13:27:37.292174 4677 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone" Oct 07 13:27:37 crc kubenswrapper[4677]: I1007 13:27:37.297985 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-db-sync-76wjh"] Oct 07 13:27:37 crc kubenswrapper[4677]: I1007 13:27:37.366540 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vfwwj\" (UniqueName: \"kubernetes.io/projected/57e4acb7-6ca2-4ca3-acfe-b1cb5f23917c-kube-api-access-vfwwj\") pod \"keystone-db-sync-76wjh\" (UID: \"57e4acb7-6ca2-4ca3-acfe-b1cb5f23917c\") " pod="keystone-kuttl-tests/keystone-db-sync-76wjh" Oct 07 13:27:37 crc kubenswrapper[4677]: I1007 13:27:37.366974 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57e4acb7-6ca2-4ca3-acfe-b1cb5f23917c-config-data\") pod \"keystone-db-sync-76wjh\" (UID: \"57e4acb7-6ca2-4ca3-acfe-b1cb5f23917c\") " pod="keystone-kuttl-tests/keystone-db-sync-76wjh" Oct 07 13:27:37 crc kubenswrapper[4677]: I1007 13:27:37.468579 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vfwwj\" (UniqueName: \"kubernetes.io/projected/57e4acb7-6ca2-4ca3-acfe-b1cb5f23917c-kube-api-access-vfwwj\") pod \"keystone-db-sync-76wjh\" (UID: \"57e4acb7-6ca2-4ca3-acfe-b1cb5f23917c\") " pod="keystone-kuttl-tests/keystone-db-sync-76wjh" Oct 07 13:27:37 crc kubenswrapper[4677]: I1007 13:27:37.469573 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57e4acb7-6ca2-4ca3-acfe-b1cb5f23917c-config-data\") pod \"keystone-db-sync-76wjh\" (UID: \"57e4acb7-6ca2-4ca3-acfe-b1cb5f23917c\") " pod="keystone-kuttl-tests/keystone-db-sync-76wjh" Oct 07 13:27:37 crc 
kubenswrapper[4677]: I1007 13:27:37.477013 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57e4acb7-6ca2-4ca3-acfe-b1cb5f23917c-config-data\") pod \"keystone-db-sync-76wjh\" (UID: \"57e4acb7-6ca2-4ca3-acfe-b1cb5f23917c\") " pod="keystone-kuttl-tests/keystone-db-sync-76wjh" Oct 07 13:27:37 crc kubenswrapper[4677]: I1007 13:27:37.489145 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vfwwj\" (UniqueName: \"kubernetes.io/projected/57e4acb7-6ca2-4ca3-acfe-b1cb5f23917c-kube-api-access-vfwwj\") pod \"keystone-db-sync-76wjh\" (UID: \"57e4acb7-6ca2-4ca3-acfe-b1cb5f23917c\") " pod="keystone-kuttl-tests/keystone-db-sync-76wjh" Oct 07 13:27:37 crc kubenswrapper[4677]: I1007 13:27:37.614651 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-db-sync-76wjh" Oct 07 13:27:37 crc kubenswrapper[4677]: I1007 13:27:37.867880 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-db-sync-76wjh"] Oct 07 13:27:37 crc kubenswrapper[4677]: W1007 13:27:37.880912 4677 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod57e4acb7_6ca2_4ca3_acfe_b1cb5f23917c.slice/crio-ac245a36902ab5a5acc439a1a0229dea19d6e4c0416c8d85842c8daae2f12bc2 WatchSource:0}: Error finding container ac245a36902ab5a5acc439a1a0229dea19d6e4c0416c8d85842c8daae2f12bc2: Status 404 returned error can't find the container with id ac245a36902ab5a5acc439a1a0229dea19d6e4c0416c8d85842c8daae2f12bc2 Oct 07 13:27:38 crc kubenswrapper[4677]: I1007 13:27:38.451061 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-sync-76wjh" event={"ID":"57e4acb7-6ca2-4ca3-acfe-b1cb5f23917c","Type":"ContainerStarted","Data":"d308a1b583636f423146a7c96204148801c49e5ecf3d98421cb67c20d059ab95"} Oct 07 13:27:38 crc kubenswrapper[4677]: I1007 13:27:38.451400 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-sync-76wjh" event={"ID":"57e4acb7-6ca2-4ca3-acfe-b1cb5f23917c","Type":"ContainerStarted","Data":"ac245a36902ab5a5acc439a1a0229dea19d6e4c0416c8d85842c8daae2f12bc2"} Oct 07 13:27:38 crc kubenswrapper[4677]: I1007 13:27:38.472284 4677 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keystone-kuttl-tests/keystone-db-sync-76wjh" podStartSLOduration=1.472254287 podStartE2EDuration="1.472254287s" podCreationTimestamp="2025-10-07 13:27:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:27:38.468277593 +0000 UTC m=+1229.953986728" watchObservedRunningTime="2025-10-07 13:27:38.472254287 +0000 UTC m=+1229.957963442" Oct 07 13:27:40 crc kubenswrapper[4677]: I1007 13:27:40.489193 4677 generic.go:334] "Generic (PLEG): container finished" podID="57e4acb7-6ca2-4ca3-acfe-b1cb5f23917c" containerID="d308a1b583636f423146a7c96204148801c49e5ecf3d98421cb67c20d059ab95" exitCode=0 Oct 07 13:27:40 crc kubenswrapper[4677]: I1007 13:27:40.489309 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-sync-76wjh" event={"ID":"57e4acb7-6ca2-4ca3-acfe-b1cb5f23917c","Type":"ContainerDied","Data":"d308a1b583636f423146a7c96204148801c49e5ecf3d98421cb67c20d059ab95"} Oct 07 13:27:40 crc kubenswrapper[4677]: I1007 13:27:40.917839 4677 patch_prober.go:28] interesting 
pod/machine-config-daemon-r7cnz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 13:27:40 crc kubenswrapper[4677]: I1007 13:27:40.917911 4677 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-r7cnz" podUID="7879fa59-a7cb-4d29-ba3a-c91f43bfcba6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 13:27:41 crc kubenswrapper[4677]: I1007 13:27:41.839168 4677 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-db-sync-76wjh" Oct 07 13:27:41 crc kubenswrapper[4677]: I1007 13:27:41.936786 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vfwwj\" (UniqueName: \"kubernetes.io/projected/57e4acb7-6ca2-4ca3-acfe-b1cb5f23917c-kube-api-access-vfwwj\") pod \"57e4acb7-6ca2-4ca3-acfe-b1cb5f23917c\" (UID: \"57e4acb7-6ca2-4ca3-acfe-b1cb5f23917c\") " Oct 07 13:27:41 crc kubenswrapper[4677]: I1007 13:27:41.936845 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57e4acb7-6ca2-4ca3-acfe-b1cb5f23917c-config-data\") pod \"57e4acb7-6ca2-4ca3-acfe-b1cb5f23917c\" (UID: \"57e4acb7-6ca2-4ca3-acfe-b1cb5f23917c\") " Oct 07 13:27:41 crc kubenswrapper[4677]: I1007 13:27:41.945239 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57e4acb7-6ca2-4ca3-acfe-b1cb5f23917c-kube-api-access-vfwwj" (OuterVolumeSpecName: "kube-api-access-vfwwj") pod "57e4acb7-6ca2-4ca3-acfe-b1cb5f23917c" (UID: "57e4acb7-6ca2-4ca3-acfe-b1cb5f23917c"). InnerVolumeSpecName "kube-api-access-vfwwj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:27:41 crc kubenswrapper[4677]: I1007 13:27:41.990047 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57e4acb7-6ca2-4ca3-acfe-b1cb5f23917c-config-data" (OuterVolumeSpecName: "config-data") pod "57e4acb7-6ca2-4ca3-acfe-b1cb5f23917c" (UID: "57e4acb7-6ca2-4ca3-acfe-b1cb5f23917c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:27:42 crc kubenswrapper[4677]: I1007 13:27:42.039239 4677 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vfwwj\" (UniqueName: \"kubernetes.io/projected/57e4acb7-6ca2-4ca3-acfe-b1cb5f23917c-kube-api-access-vfwwj\") on node \"crc\" DevicePath \"\"" Oct 07 13:27:42 crc kubenswrapper[4677]: I1007 13:27:42.039294 4677 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57e4acb7-6ca2-4ca3-acfe-b1cb5f23917c-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 13:27:42 crc kubenswrapper[4677]: I1007 13:27:42.510168 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-sync-76wjh" event={"ID":"57e4acb7-6ca2-4ca3-acfe-b1cb5f23917c","Type":"ContainerDied","Data":"ac245a36902ab5a5acc439a1a0229dea19d6e4c0416c8d85842c8daae2f12bc2"} Oct 07 13:27:42 crc kubenswrapper[4677]: I1007 13:27:42.510724 4677 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ac245a36902ab5a5acc439a1a0229dea19d6e4c0416c8d85842c8daae2f12bc2" Oct 07 13:27:42 crc kubenswrapper[4677]: I1007 13:27:42.510238 4677 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-db-sync-76wjh" Oct 07 13:27:42 crc kubenswrapper[4677]: I1007 13:27:42.699399 4677 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone-bootstrap-jghjv"] Oct 07 13:27:42 crc kubenswrapper[4677]: E1007 13:27:42.699950 4677 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57e4acb7-6ca2-4ca3-acfe-b1cb5f23917c" containerName="keystone-db-sync" Oct 07 13:27:42 crc kubenswrapper[4677]: I1007 13:27:42.699963 4677 state_mem.go:107] "Deleted CPUSet assignment" podUID="57e4acb7-6ca2-4ca3-acfe-b1cb5f23917c" containerName="keystone-db-sync" Oct 07 13:27:42 crc kubenswrapper[4677]: I1007 13:27:42.700082 4677 memory_manager.go:354] "RemoveStaleState removing state" podUID="57e4acb7-6ca2-4ca3-acfe-b1cb5f23917c" containerName="keystone-db-sync" Oct 07 13:27:42 crc kubenswrapper[4677]: I1007 13:27:42.700499 4677 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-bootstrap-jghjv" Oct 07 13:27:42 crc kubenswrapper[4677]: I1007 13:27:42.709115 4677 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-scripts" Oct 07 13:27:42 crc kubenswrapper[4677]: I1007 13:27:42.709982 4677 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-config-data" Oct 07 13:27:42 crc kubenswrapper[4677]: I1007 13:27:42.710210 4677 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-keystone-dockercfg-f6wgp" Oct 07 13:27:42 crc kubenswrapper[4677]: I1007 13:27:42.710740 4677 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone" Oct 07 13:27:42 crc kubenswrapper[4677]: I1007 13:27:42.722794 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-bootstrap-jghjv"] Oct 07 13:27:42 crc kubenswrapper[4677]: I1007 13:27:42.851828 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d8wp2\" (UniqueName: \"kubernetes.io/projected/0aad62c8-faba-4b12-a924-19089f667587-kube-api-access-d8wp2\") pod \"keystone-bootstrap-jghjv\" (UID: \"0aad62c8-faba-4b12-a924-19089f667587\") " pod="keystone-kuttl-tests/keystone-bootstrap-jghjv" Oct 07 13:27:42 crc kubenswrapper[4677]: I1007 13:27:42.851889 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0aad62c8-faba-4b12-a924-19089f667587-credential-keys\") pod \"keystone-bootstrap-jghjv\" (UID: \"0aad62c8-faba-4b12-a924-19089f667587\") " pod="keystone-kuttl-tests/keystone-bootstrap-jghjv" Oct 07 13:27:42 crc kubenswrapper[4677]: I1007 13:27:42.851925 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0aad62c8-faba-4b12-a924-19089f667587-config-data\") pod \"keystone-bootstrap-jghjv\" (UID: \"0aad62c8-faba-4b12-a924-19089f667587\") " pod="keystone-kuttl-tests/keystone-bootstrap-jghjv" Oct 07 13:27:42 crc kubenswrapper[4677]: I1007 13:27:42.851997 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0aad62c8-faba-4b12-a924-19089f667587-scripts\") pod \"keystone-bootstrap-jghjv\" (UID: \"0aad62c8-faba-4b12-a924-19089f667587\") " pod="keystone-kuttl-tests/keystone-bootstrap-jghjv" Oct 07 13:27:42 crc kubenswrapper[4677]: I1007 13:27:42.852121 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0aad62c8-faba-4b12-a924-19089f667587-fernet-keys\") pod \"keystone-bootstrap-jghjv\" (UID: \"0aad62c8-faba-4b12-a924-19089f667587\") " pod="keystone-kuttl-tests/keystone-bootstrap-jghjv" Oct 07 13:27:42 crc kubenswrapper[4677]: I1007 13:27:42.953904 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0aad62c8-faba-4b12-a924-19089f667587-credential-keys\") pod \"keystone-bootstrap-jghjv\" (UID: \"0aad62c8-faba-4b12-a924-19089f667587\") " pod="keystone-kuttl-tests/keystone-bootstrap-jghjv" Oct 07 13:27:42 crc kubenswrapper[4677]: I1007 13:27:42.953976 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/0aad62c8-faba-4b12-a924-19089f667587-config-data\") pod \"keystone-bootstrap-jghjv\" (UID: \"0aad62c8-faba-4b12-a924-19089f667587\") " pod="keystone-kuttl-tests/keystone-bootstrap-jghjv" Oct 07 13:27:42 crc kubenswrapper[4677]: I1007 13:27:42.954043 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0aad62c8-faba-4b12-a924-19089f667587-scripts\") pod \"keystone-bootstrap-jghjv\" (UID: \"0aad62c8-faba-4b12-a924-19089f667587\") " pod="keystone-kuttl-tests/keystone-bootstrap-jghjv" Oct 07 13:27:42 crc kubenswrapper[4677]: I1007 13:27:42.954089 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0aad62c8-faba-4b12-a924-19089f667587-fernet-keys\") pod \"keystone-bootstrap-jghjv\" (UID: \"0aad62c8-faba-4b12-a924-19089f667587\") " pod="keystone-kuttl-tests/keystone-bootstrap-jghjv" Oct 07 13:27:42 crc kubenswrapper[4677]: I1007 13:27:42.954141 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d8wp2\" (UniqueName: \"kubernetes.io/projected/0aad62c8-faba-4b12-a924-19089f667587-kube-api-access-d8wp2\") pod \"keystone-bootstrap-jghjv\" (UID: \"0aad62c8-faba-4b12-a924-19089f667587\") " pod="keystone-kuttl-tests/keystone-bootstrap-jghjv" Oct 07 13:27:42 crc kubenswrapper[4677]: I1007 13:27:42.960717 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0aad62c8-faba-4b12-a924-19089f667587-config-data\") pod \"keystone-bootstrap-jghjv\" (UID: \"0aad62c8-faba-4b12-a924-19089f667587\") " pod="keystone-kuttl-tests/keystone-bootstrap-jghjv" Oct 07 13:27:42 crc kubenswrapper[4677]: I1007 13:27:42.965890 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0aad62c8-faba-4b12-a924-19089f667587-scripts\") pod \"keystone-bootstrap-jghjv\" (UID: \"0aad62c8-faba-4b12-a924-19089f667587\") " pod="keystone-kuttl-tests/keystone-bootstrap-jghjv" Oct 07 13:27:42 crc kubenswrapper[4677]: I1007 13:27:42.966072 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0aad62c8-faba-4b12-a924-19089f667587-credential-keys\") pod \"keystone-bootstrap-jghjv\" (UID: \"0aad62c8-faba-4b12-a924-19089f667587\") " pod="keystone-kuttl-tests/keystone-bootstrap-jghjv" Oct 07 13:27:42 crc kubenswrapper[4677]: I1007 13:27:42.966671 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0aad62c8-faba-4b12-a924-19089f667587-fernet-keys\") pod \"keystone-bootstrap-jghjv\" (UID: \"0aad62c8-faba-4b12-a924-19089f667587\") " pod="keystone-kuttl-tests/keystone-bootstrap-jghjv" Oct 07 13:27:42 crc kubenswrapper[4677]: I1007 13:27:42.981316 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d8wp2\" (UniqueName: \"kubernetes.io/projected/0aad62c8-faba-4b12-a924-19089f667587-kube-api-access-d8wp2\") pod \"keystone-bootstrap-jghjv\" (UID: \"0aad62c8-faba-4b12-a924-19089f667587\") " pod="keystone-kuttl-tests/keystone-bootstrap-jghjv" Oct 07 13:27:43 crc kubenswrapper[4677]: I1007 13:27:43.029017 4677 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-bootstrap-jghjv" Oct 07 13:27:43 crc kubenswrapper[4677]: I1007 13:27:43.498097 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-bootstrap-jghjv"] Oct 07 13:27:43 crc kubenswrapper[4677]: W1007 13:27:43.508335 4677 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0aad62c8_faba_4b12_a924_19089f667587.slice/crio-2549aeae76731acaff61a7418924ca2316d89d0b07dba56ced688d17f91a6cbf WatchSource:0}: Error finding container 2549aeae76731acaff61a7418924ca2316d89d0b07dba56ced688d17f91a6cbf: Status 404 returned error can't find the container with id 2549aeae76731acaff61a7418924ca2316d89d0b07dba56ced688d17f91a6cbf Oct 07 13:27:43 crc kubenswrapper[4677]: I1007 13:27:43.522454 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-bootstrap-jghjv" event={"ID":"0aad62c8-faba-4b12-a924-19089f667587","Type":"ContainerStarted","Data":"2549aeae76731acaff61a7418924ca2316d89d0b07dba56ced688d17f91a6cbf"} Oct 07 13:27:44 crc kubenswrapper[4677]: I1007 13:27:44.531249 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-bootstrap-jghjv" event={"ID":"0aad62c8-faba-4b12-a924-19089f667587","Type":"ContainerStarted","Data":"00d43c02a7940f8edba2ab6223c3c3c22c2dac93d3333a1aeac76156a786a00a"} Oct 07 13:27:44 crc kubenswrapper[4677]: I1007 13:27:44.554094 4677 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keystone-kuttl-tests/keystone-bootstrap-jghjv" podStartSLOduration=2.554071951 podStartE2EDuration="2.554071951s" podCreationTimestamp="2025-10-07 13:27:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:27:44.551338853 +0000 UTC m=+1236.037047968" watchObservedRunningTime="2025-10-07 13:27:44.554071951 +0000 UTC m=+1236.039781086" Oct 07 13:27:46 crc kubenswrapper[4677]: I1007 13:27:46.547698 4677 generic.go:334] "Generic (PLEG): container finished" podID="0aad62c8-faba-4b12-a924-19089f667587" containerID="00d43c02a7940f8edba2ab6223c3c3c22c2dac93d3333a1aeac76156a786a00a" exitCode=0 Oct 07 13:27:46 crc kubenswrapper[4677]: I1007 13:27:46.547797 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-bootstrap-jghjv" event={"ID":"0aad62c8-faba-4b12-a924-19089f667587","Type":"ContainerDied","Data":"00d43c02a7940f8edba2ab6223c3c3c22c2dac93d3333a1aeac76156a786a00a"} Oct 07 13:27:47 crc kubenswrapper[4677]: I1007 13:27:47.922696 4677 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-bootstrap-jghjv" Oct 07 13:27:48 crc kubenswrapper[4677]: I1007 13:27:48.031908 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0aad62c8-faba-4b12-a924-19089f667587-scripts\") pod \"0aad62c8-faba-4b12-a924-19089f667587\" (UID: \"0aad62c8-faba-4b12-a924-19089f667587\") " Oct 07 13:27:48 crc kubenswrapper[4677]: I1007 13:27:48.032001 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0aad62c8-faba-4b12-a924-19089f667587-fernet-keys\") pod \"0aad62c8-faba-4b12-a924-19089f667587\" (UID: \"0aad62c8-faba-4b12-a924-19089f667587\") " Oct 07 13:27:48 crc kubenswrapper[4677]: I1007 13:27:48.032105 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d8wp2\" (UniqueName: \"kubernetes.io/projected/0aad62c8-faba-4b12-a924-19089f667587-kube-api-access-d8wp2\") pod \"0aad62c8-faba-4b12-a924-19089f667587\" (UID: \"0aad62c8-faba-4b12-a924-19089f667587\") " Oct 07 13:27:48 crc kubenswrapper[4677]: I1007 13:27:48.032174 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0aad62c8-faba-4b12-a924-19089f667587-credential-keys\") pod \"0aad62c8-faba-4b12-a924-19089f667587\" (UID: \"0aad62c8-faba-4b12-a924-19089f667587\") " Oct 07 13:27:48 crc kubenswrapper[4677]: I1007 13:27:48.032234 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0aad62c8-faba-4b12-a924-19089f667587-config-data\") pod \"0aad62c8-faba-4b12-a924-19089f667587\" (UID: \"0aad62c8-faba-4b12-a924-19089f667587\") " Oct 07 13:27:48 crc kubenswrapper[4677]: I1007 13:27:48.040017 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0aad62c8-faba-4b12-a924-19089f667587-kube-api-access-d8wp2" (OuterVolumeSpecName: "kube-api-access-d8wp2") pod "0aad62c8-faba-4b12-a924-19089f667587" (UID: "0aad62c8-faba-4b12-a924-19089f667587"). InnerVolumeSpecName "kube-api-access-d8wp2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:27:48 crc kubenswrapper[4677]: I1007 13:27:48.040585 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0aad62c8-faba-4b12-a924-19089f667587-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "0aad62c8-faba-4b12-a924-19089f667587" (UID: "0aad62c8-faba-4b12-a924-19089f667587"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:27:48 crc kubenswrapper[4677]: I1007 13:27:48.041001 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0aad62c8-faba-4b12-a924-19089f667587-scripts" (OuterVolumeSpecName: "scripts") pod "0aad62c8-faba-4b12-a924-19089f667587" (UID: "0aad62c8-faba-4b12-a924-19089f667587"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:27:48 crc kubenswrapper[4677]: I1007 13:27:48.041138 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0aad62c8-faba-4b12-a924-19089f667587-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "0aad62c8-faba-4b12-a924-19089f667587" (UID: "0aad62c8-faba-4b12-a924-19089f667587"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:27:48 crc kubenswrapper[4677]: I1007 13:27:48.073066 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0aad62c8-faba-4b12-a924-19089f667587-config-data" (OuterVolumeSpecName: "config-data") pod "0aad62c8-faba-4b12-a924-19089f667587" (UID: "0aad62c8-faba-4b12-a924-19089f667587"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:27:48 crc kubenswrapper[4677]: I1007 13:27:48.134001 4677 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0aad62c8-faba-4b12-a924-19089f667587-credential-keys\") on node \"crc\" DevicePath \"\"" Oct 07 13:27:48 crc kubenswrapper[4677]: I1007 13:27:48.134055 4677 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0aad62c8-faba-4b12-a924-19089f667587-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 13:27:48 crc kubenswrapper[4677]: I1007 13:27:48.134074 4677 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0aad62c8-faba-4b12-a924-19089f667587-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 13:27:48 crc kubenswrapper[4677]: I1007 13:27:48.134091 4677 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0aad62c8-faba-4b12-a924-19089f667587-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 07 13:27:48 crc kubenswrapper[4677]: I1007 13:27:48.134111 4677 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d8wp2\" (UniqueName: \"kubernetes.io/projected/0aad62c8-faba-4b12-a924-19089f667587-kube-api-access-d8wp2\") on node \"crc\" DevicePath \"\"" Oct 07 13:27:48 crc kubenswrapper[4677]: I1007 13:27:48.577974 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-bootstrap-jghjv" event={"ID":"0aad62c8-faba-4b12-a924-19089f667587","Type":"ContainerDied","Data":"2549aeae76731acaff61a7418924ca2316d89d0b07dba56ced688d17f91a6cbf"} Oct 07 13:27:48 crc kubenswrapper[4677]: I1007 13:27:48.578066 4677 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2549aeae76731acaff61a7418924ca2316d89d0b07dba56ced688d17f91a6cbf" Oct 07 13:27:48 crc kubenswrapper[4677]: I1007 13:27:48.578236 4677 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-bootstrap-jghjv" Oct 07 13:27:48 crc kubenswrapper[4677]: I1007 13:27:48.666917 4677 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone-5f7d4d4854-cgrcw"] Oct 07 13:27:48 crc kubenswrapper[4677]: E1007 13:27:48.667252 4677 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0aad62c8-faba-4b12-a924-19089f667587" containerName="keystone-bootstrap" Oct 07 13:27:48 crc kubenswrapper[4677]: I1007 13:27:48.667268 4677 state_mem.go:107] "Deleted CPUSet assignment" podUID="0aad62c8-faba-4b12-a924-19089f667587" containerName="keystone-bootstrap" Oct 07 13:27:48 crc kubenswrapper[4677]: I1007 13:27:48.667487 4677 memory_manager.go:354] "RemoveStaleState removing state" podUID="0aad62c8-faba-4b12-a924-19089f667587" containerName="keystone-bootstrap" Oct 07 13:27:48 crc kubenswrapper[4677]: I1007 13:27:48.668039 4677 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-5f7d4d4854-cgrcw" Oct 07 13:27:48 crc kubenswrapper[4677]: I1007 13:27:48.671418 4677 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-config-data" Oct 07 13:27:48 crc kubenswrapper[4677]: I1007 13:27:48.671785 4677 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-scripts" Oct 07 13:27:48 crc kubenswrapper[4677]: I1007 13:27:48.672809 4677 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-keystone-dockercfg-f6wgp" Oct 07 13:27:48 crc kubenswrapper[4677]: I1007 13:27:48.672954 4677 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone" Oct 07 13:27:48 crc kubenswrapper[4677]: I1007 13:27:48.686853 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-5f7d4d4854-cgrcw"] Oct 07 13:27:48 crc kubenswrapper[4677]: I1007 13:27:48.742513 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/c4ee04d7-9071-415d-94cd-daaace73b138-credential-keys\") pod \"keystone-5f7d4d4854-cgrcw\" (UID: \"c4ee04d7-9071-415d-94cd-daaace73b138\") " pod="keystone-kuttl-tests/keystone-5f7d4d4854-cgrcw" Oct 07 13:27:48 crc kubenswrapper[4677]: I1007 13:27:48.742594 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c4ee04d7-9071-415d-94cd-daaace73b138-config-data\") pod \"keystone-5f7d4d4854-cgrcw\" (UID: \"c4ee04d7-9071-415d-94cd-daaace73b138\") " pod="keystone-kuttl-tests/keystone-5f7d4d4854-cgrcw" Oct 07 13:27:48 crc kubenswrapper[4677]: I1007 13:27:48.742680 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c4ee04d7-9071-415d-94cd-daaace73b138-scripts\") pod \"keystone-5f7d4d4854-cgrcw\" (UID: \"c4ee04d7-9071-415d-94cd-daaace73b138\") " pod="keystone-kuttl-tests/keystone-5f7d4d4854-cgrcw" Oct 07 13:27:48 crc kubenswrapper[4677]: I1007 13:27:48.742709 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6pv57\" (UniqueName: \"kubernetes.io/projected/c4ee04d7-9071-415d-94cd-daaace73b138-kube-api-access-6pv57\") pod \"keystone-5f7d4d4854-cgrcw\" (UID: \"c4ee04d7-9071-415d-94cd-daaace73b138\") " pod="keystone-kuttl-tests/keystone-5f7d4d4854-cgrcw" Oct 07 13:27:48 crc kubenswrapper[4677]: I1007 13:27:48.742883 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c4ee04d7-9071-415d-94cd-daaace73b138-fernet-keys\") pod \"keystone-5f7d4d4854-cgrcw\" (UID: \"c4ee04d7-9071-415d-94cd-daaace73b138\") " pod="keystone-kuttl-tests/keystone-5f7d4d4854-cgrcw" Oct 07 13:27:48 crc kubenswrapper[4677]: I1007 13:27:48.844071 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c4ee04d7-9071-415d-94cd-daaace73b138-config-data\") pod \"keystone-5f7d4d4854-cgrcw\" (UID: \"c4ee04d7-9071-415d-94cd-daaace73b138\") " pod="keystone-kuttl-tests/keystone-5f7d4d4854-cgrcw" Oct 07 13:27:48 crc kubenswrapper[4677]: I1007 13:27:48.844163 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/c4ee04d7-9071-415d-94cd-daaace73b138-scripts\") pod \"keystone-5f7d4d4854-cgrcw\" (UID: \"c4ee04d7-9071-415d-94cd-daaace73b138\") " pod="keystone-kuttl-tests/keystone-5f7d4d4854-cgrcw" Oct 07 13:27:48 crc kubenswrapper[4677]: I1007 13:27:48.844193 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6pv57\" (UniqueName: \"kubernetes.io/projected/c4ee04d7-9071-415d-94cd-daaace73b138-kube-api-access-6pv57\") pod \"keystone-5f7d4d4854-cgrcw\" (UID: \"c4ee04d7-9071-415d-94cd-daaace73b138\") " pod="keystone-kuttl-tests/keystone-5f7d4d4854-cgrcw" Oct 07 13:27:48 crc kubenswrapper[4677]: I1007 13:27:48.844238 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c4ee04d7-9071-415d-94cd-daaace73b138-fernet-keys\") pod \"keystone-5f7d4d4854-cgrcw\" (UID: \"c4ee04d7-9071-415d-94cd-daaace73b138\") " pod="keystone-kuttl-tests/keystone-5f7d4d4854-cgrcw" Oct 07 13:27:48 crc kubenswrapper[4677]: I1007 13:27:48.844267 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/c4ee04d7-9071-415d-94cd-daaace73b138-credential-keys\") pod \"keystone-5f7d4d4854-cgrcw\" (UID: \"c4ee04d7-9071-415d-94cd-daaace73b138\") " pod="keystone-kuttl-tests/keystone-5f7d4d4854-cgrcw" Oct 07 13:27:48 crc kubenswrapper[4677]: I1007 13:27:48.849392 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c4ee04d7-9071-415d-94cd-daaace73b138-scripts\") pod \"keystone-5f7d4d4854-cgrcw\" (UID: \"c4ee04d7-9071-415d-94cd-daaace73b138\") " pod="keystone-kuttl-tests/keystone-5f7d4d4854-cgrcw" Oct 07 13:27:48 crc kubenswrapper[4677]: I1007 13:27:48.849988 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c4ee04d7-9071-415d-94cd-daaace73b138-config-data\") pod \"keystone-5f7d4d4854-cgrcw\" (UID: \"c4ee04d7-9071-415d-94cd-daaace73b138\") " pod="keystone-kuttl-tests/keystone-5f7d4d4854-cgrcw" Oct 07 13:27:48 crc kubenswrapper[4677]: I1007 13:27:48.850599 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/c4ee04d7-9071-415d-94cd-daaace73b138-credential-keys\") pod \"keystone-5f7d4d4854-cgrcw\" (UID: \"c4ee04d7-9071-415d-94cd-daaace73b138\") " pod="keystone-kuttl-tests/keystone-5f7d4d4854-cgrcw" Oct 07 13:27:48 crc kubenswrapper[4677]: I1007 13:27:48.854316 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c4ee04d7-9071-415d-94cd-daaace73b138-fernet-keys\") pod \"keystone-5f7d4d4854-cgrcw\" (UID: \"c4ee04d7-9071-415d-94cd-daaace73b138\") " pod="keystone-kuttl-tests/keystone-5f7d4d4854-cgrcw" Oct 07 13:27:48 crc kubenswrapper[4677]: I1007 13:27:48.875149 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6pv57\" (UniqueName: \"kubernetes.io/projected/c4ee04d7-9071-415d-94cd-daaace73b138-kube-api-access-6pv57\") pod \"keystone-5f7d4d4854-cgrcw\" (UID: \"c4ee04d7-9071-415d-94cd-daaace73b138\") " pod="keystone-kuttl-tests/keystone-5f7d4d4854-cgrcw" Oct 07 13:27:48 crc kubenswrapper[4677]: I1007 13:27:48.983576 4677 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-5f7d4d4854-cgrcw" Oct 07 13:27:49 crc kubenswrapper[4677]: I1007 13:27:49.425921 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-5f7d4d4854-cgrcw"] Oct 07 13:27:49 crc kubenswrapper[4677]: I1007 13:27:49.586965 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-5f7d4d4854-cgrcw" event={"ID":"c4ee04d7-9071-415d-94cd-daaace73b138","Type":"ContainerStarted","Data":"e77af5fd03b269759c612e7fd4fbf7232232e59583939d50e3f782bd7ca84dae"} Oct 07 13:27:49 crc kubenswrapper[4677]: I1007 13:27:49.587240 4677 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="keystone-kuttl-tests/keystone-5f7d4d4854-cgrcw" Oct 07 13:27:49 crc kubenswrapper[4677]: I1007 13:27:49.587252 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-5f7d4d4854-cgrcw" event={"ID":"c4ee04d7-9071-415d-94cd-daaace73b138","Type":"ContainerStarted","Data":"e3ec609018d4090adc41e9c692194e95d0fd44715f28ae71bacd11ab8b136711"} Oct 07 13:27:49 crc kubenswrapper[4677]: I1007 13:27:49.602077 4677 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keystone-kuttl-tests/keystone-5f7d4d4854-cgrcw" podStartSLOduration=1.6020567479999999 podStartE2EDuration="1.602056748s" podCreationTimestamp="2025-10-07 13:27:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:27:49.601405689 +0000 UTC m=+1241.087114834" watchObservedRunningTime="2025-10-07 13:27:49.602056748 +0000 UTC m=+1241.087765863" Oct 07 13:28:10 crc kubenswrapper[4677]: I1007 13:28:10.120315 4677 scope.go:117] "RemoveContainer" containerID="37d41ffe7a8114750e66197238c19584d3771fc02d844b75cdcb4df7324ac026" Oct 07 13:28:10 crc kubenswrapper[4677]: I1007 13:28:10.147829 4677 scope.go:117] "RemoveContainer" containerID="40844dfa58017a0eeef941d1e032645d31a457ef288d1035463cd3709434c0c6" Oct 07 13:28:10 crc kubenswrapper[4677]: I1007 13:28:10.916942 4677 patch_prober.go:28] interesting pod/machine-config-daemon-r7cnz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 13:28:10 crc kubenswrapper[4677]: I1007 13:28:10.917313 4677 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-r7cnz" podUID="7879fa59-a7cb-4d29-ba3a-c91f43bfcba6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 13:28:10 crc kubenswrapper[4677]: I1007 13:28:10.917374 4677 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-r7cnz" Oct 07 13:28:10 crc kubenswrapper[4677]: I1007 13:28:10.918160 4677 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"31488fc6d756c96c0843de0c3c1b89a15439285821997035e8781aaf44c08c84"} pod="openshift-machine-config-operator/machine-config-daemon-r7cnz" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 07 13:28:10 crc kubenswrapper[4677]: I1007 13:28:10.918237 4677 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-r7cnz" podUID="7879fa59-a7cb-4d29-ba3a-c91f43bfcba6" containerName="machine-config-daemon" containerID="cri-o://31488fc6d756c96c0843de0c3c1b89a15439285821997035e8781aaf44c08c84" gracePeriod=600 Oct 07 13:28:11 crc kubenswrapper[4677]: I1007 13:28:11.786846 4677 generic.go:334] "Generic (PLEG): container finished" podID="7879fa59-a7cb-4d29-ba3a-c91f43bfcba6" containerID="31488fc6d756c96c0843de0c3c1b89a15439285821997035e8781aaf44c08c84" exitCode=0 Oct 07 13:28:11 crc kubenswrapper[4677]: I1007 13:28:11.786906 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-r7cnz" event={"ID":"7879fa59-a7cb-4d29-ba3a-c91f43bfcba6","Type":"ContainerDied","Data":"31488fc6d756c96c0843de0c3c1b89a15439285821997035e8781aaf44c08c84"} Oct 07 13:28:11 crc kubenswrapper[4677]: I1007 13:28:11.787266 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-r7cnz" event={"ID":"7879fa59-a7cb-4d29-ba3a-c91f43bfcba6","Type":"ContainerStarted","Data":"ae51911f66a891e48f90a715757af601cb9a83227dd098a756b0a7910507b4bc"} Oct 07 13:28:11 crc kubenswrapper[4677]: I1007 13:28:11.787300 4677 scope.go:117] "RemoveContainer" containerID="37fd49a51a3bd5137d45d074e28b4ab8e0800f2fea4c41dcc155e9985c92e63a" Oct 07 13:28:20 crc kubenswrapper[4677]: I1007 13:28:20.401716 4677 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="keystone-kuttl-tests/keystone-5f7d4d4854-cgrcw" Oct 07 13:28:21 crc kubenswrapper[4677]: I1007 13:28:21.014417 4677 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/openstackclient"] Oct 07 13:28:21 crc kubenswrapper[4677]: I1007 13:28:21.016259 4677 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/openstackclient" Oct 07 13:28:21 crc kubenswrapper[4677]: I1007 13:28:21.019170 4677 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"default-dockercfg-sz7jm" Oct 07 13:28:21 crc kubenswrapper[4677]: I1007 13:28:21.019795 4677 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"openstack-config-secret" Oct 07 13:28:21 crc kubenswrapper[4677]: I1007 13:28:21.020730 4677 reflector.go:368] Caches populated for *v1.ConfigMap from object-"keystone-kuttl-tests"/"openstack-config" Oct 07 13:28:21 crc kubenswrapper[4677]: I1007 13:28:21.033641 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/openstackclient"] Oct 07 13:28:21 crc kubenswrapper[4677]: I1007 13:28:21.142030 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gzv64\" (UniqueName: \"kubernetes.io/projected/94dc3f8c-381e-451a-abd2-802c99de6e6b-kube-api-access-gzv64\") pod \"openstackclient\" (UID: \"94dc3f8c-381e-451a-abd2-802c99de6e6b\") " pod="keystone-kuttl-tests/openstackclient" Oct 07 13:28:21 crc kubenswrapper[4677]: I1007 13:28:21.142096 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/94dc3f8c-381e-451a-abd2-802c99de6e6b-openstack-config-secret\") pod \"openstackclient\" (UID: \"94dc3f8c-381e-451a-abd2-802c99de6e6b\") " pod="keystone-kuttl-tests/openstackclient" Oct 07 13:28:21 crc kubenswrapper[4677]: I1007 13:28:21.142192 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/94dc3f8c-381e-451a-abd2-802c99de6e6b-openstack-config\") pod \"openstackclient\" (UID: \"94dc3f8c-381e-451a-abd2-802c99de6e6b\") " pod="keystone-kuttl-tests/openstackclient" Oct 07 13:28:21 crc kubenswrapper[4677]: I1007 13:28:21.244247 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/94dc3f8c-381e-451a-abd2-802c99de6e6b-openstack-config-secret\") pod \"openstackclient\" (UID: \"94dc3f8c-381e-451a-abd2-802c99de6e6b\") " pod="keystone-kuttl-tests/openstackclient" Oct 07 13:28:21 crc kubenswrapper[4677]: I1007 13:28:21.244649 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/94dc3f8c-381e-451a-abd2-802c99de6e6b-openstack-config\") pod \"openstackclient\" (UID: \"94dc3f8c-381e-451a-abd2-802c99de6e6b\") " pod="keystone-kuttl-tests/openstackclient" Oct 07 13:28:21 crc kubenswrapper[4677]: I1007 13:28:21.244725 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gzv64\" (UniqueName: \"kubernetes.io/projected/94dc3f8c-381e-451a-abd2-802c99de6e6b-kube-api-access-gzv64\") pod \"openstackclient\" (UID: \"94dc3f8c-381e-451a-abd2-802c99de6e6b\") " pod="keystone-kuttl-tests/openstackclient" Oct 07 13:28:21 crc kubenswrapper[4677]: I1007 13:28:21.246659 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/94dc3f8c-381e-451a-abd2-802c99de6e6b-openstack-config\") pod \"openstackclient\" (UID: \"94dc3f8c-381e-451a-abd2-802c99de6e6b\") " pod="keystone-kuttl-tests/openstackclient" Oct 07 13:28:21 crc kubenswrapper[4677]: I1007 
13:28:21.255982 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/94dc3f8c-381e-451a-abd2-802c99de6e6b-openstack-config-secret\") pod \"openstackclient\" (UID: \"94dc3f8c-381e-451a-abd2-802c99de6e6b\") " pod="keystone-kuttl-tests/openstackclient" Oct 07 13:28:21 crc kubenswrapper[4677]: I1007 13:28:21.278900 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gzv64\" (UniqueName: \"kubernetes.io/projected/94dc3f8c-381e-451a-abd2-802c99de6e6b-kube-api-access-gzv64\") pod \"openstackclient\" (UID: \"94dc3f8c-381e-451a-abd2-802c99de6e6b\") " pod="keystone-kuttl-tests/openstackclient" Oct 07 13:28:21 crc kubenswrapper[4677]: I1007 13:28:21.352214 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/openstackclient" Oct 07 13:28:21 crc kubenswrapper[4677]: I1007 13:28:21.766703 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/openstackclient"] Oct 07 13:28:21 crc kubenswrapper[4677]: I1007 13:28:21.776339 4677 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 07 13:28:21 crc kubenswrapper[4677]: I1007 13:28:21.880716 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/openstackclient" event={"ID":"94dc3f8c-381e-451a-abd2-802c99de6e6b","Type":"ContainerStarted","Data":"6d649fa0e2636a1f3d9b9566048b8df712474f351abb8e3646ca73084d447b17"} Oct 07 13:28:29 crc kubenswrapper[4677]: I1007 13:28:29.944503 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/openstackclient" event={"ID":"94dc3f8c-381e-451a-abd2-802c99de6e6b","Type":"ContainerStarted","Data":"34c762bd95770673880dfbe776d4e1d789ab5a131e4de939949d0cfc6517da1a"} Oct 07 13:28:29 crc kubenswrapper[4677]: I1007 13:28:29.968193 4677 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keystone-kuttl-tests/openstackclient" podStartSLOduration=2.957207852 podStartE2EDuration="9.968170124s" podCreationTimestamp="2025-10-07 13:28:20 +0000 UTC" firstStartedPulling="2025-10-07 13:28:21.77583058 +0000 UTC m=+1273.261539745" lastFinishedPulling="2025-10-07 13:28:28.786792902 +0000 UTC m=+1280.272502017" observedRunningTime="2025-10-07 13:28:29.963319225 +0000 UTC m=+1281.449028400" watchObservedRunningTime="2025-10-07 13:28:29.968170124 +0000 UTC m=+1281.453879249" Oct 07 13:29:10 crc kubenswrapper[4677]: I1007 13:29:10.226907 4677 scope.go:117] "RemoveContainer" containerID="21984cfe565a162691ddd942355e9162c94f0ef54b6859454c7da828d9d312ca" Oct 07 13:29:10 crc kubenswrapper[4677]: I1007 13:29:10.286791 4677 scope.go:117] "RemoveContainer" containerID="337a35d3da0185088efc41bb34b6262c81c262ab4264f9d1541567d20d14ab88" Oct 07 13:30:00 crc kubenswrapper[4677]: I1007 13:30:00.145205 4677 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330730-lsbmf"] Oct 07 13:30:00 crc kubenswrapper[4677]: I1007 13:30:00.146628 4677 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330730-lsbmf" Oct 07 13:30:00 crc kubenswrapper[4677]: I1007 13:30:00.150719 4677 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 07 13:30:00 crc kubenswrapper[4677]: I1007 13:30:00.151610 4677 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 07 13:30:00 crc kubenswrapper[4677]: I1007 13:30:00.158853 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330730-lsbmf"] Oct 07 13:30:00 crc kubenswrapper[4677]: I1007 13:30:00.312311 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7h69c\" (UniqueName: \"kubernetes.io/projected/7e1f6dce-f109-4ecd-9c8d-8b7349433014-kube-api-access-7h69c\") pod \"collect-profiles-29330730-lsbmf\" (UID: \"7e1f6dce-f109-4ecd-9c8d-8b7349433014\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330730-lsbmf" Oct 07 13:30:00 crc kubenswrapper[4677]: I1007 13:30:00.312358 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7e1f6dce-f109-4ecd-9c8d-8b7349433014-secret-volume\") pod \"collect-profiles-29330730-lsbmf\" (UID: \"7e1f6dce-f109-4ecd-9c8d-8b7349433014\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330730-lsbmf" Oct 07 13:30:00 crc kubenswrapper[4677]: I1007 13:30:00.312581 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7e1f6dce-f109-4ecd-9c8d-8b7349433014-config-volume\") pod \"collect-profiles-29330730-lsbmf\" (UID: \"7e1f6dce-f109-4ecd-9c8d-8b7349433014\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330730-lsbmf" Oct 07 13:30:00 crc kubenswrapper[4677]: I1007 13:30:00.414394 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7h69c\" (UniqueName: \"kubernetes.io/projected/7e1f6dce-f109-4ecd-9c8d-8b7349433014-kube-api-access-7h69c\") pod \"collect-profiles-29330730-lsbmf\" (UID: \"7e1f6dce-f109-4ecd-9c8d-8b7349433014\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330730-lsbmf" Oct 07 13:30:00 crc kubenswrapper[4677]: I1007 13:30:00.414523 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7e1f6dce-f109-4ecd-9c8d-8b7349433014-secret-volume\") pod \"collect-profiles-29330730-lsbmf\" (UID: \"7e1f6dce-f109-4ecd-9c8d-8b7349433014\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330730-lsbmf" Oct 07 13:30:00 crc kubenswrapper[4677]: I1007 13:30:00.414660 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7e1f6dce-f109-4ecd-9c8d-8b7349433014-config-volume\") pod \"collect-profiles-29330730-lsbmf\" (UID: \"7e1f6dce-f109-4ecd-9c8d-8b7349433014\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330730-lsbmf" Oct 07 13:30:00 crc kubenswrapper[4677]: I1007 13:30:00.416537 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7e1f6dce-f109-4ecd-9c8d-8b7349433014-config-volume\") pod 
\"collect-profiles-29330730-lsbmf\" (UID: \"7e1f6dce-f109-4ecd-9c8d-8b7349433014\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330730-lsbmf" Oct 07 13:30:00 crc kubenswrapper[4677]: I1007 13:30:00.430577 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7e1f6dce-f109-4ecd-9c8d-8b7349433014-secret-volume\") pod \"collect-profiles-29330730-lsbmf\" (UID: \"7e1f6dce-f109-4ecd-9c8d-8b7349433014\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330730-lsbmf" Oct 07 13:30:00 crc kubenswrapper[4677]: I1007 13:30:00.439016 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7h69c\" (UniqueName: \"kubernetes.io/projected/7e1f6dce-f109-4ecd-9c8d-8b7349433014-kube-api-access-7h69c\") pod \"collect-profiles-29330730-lsbmf\" (UID: \"7e1f6dce-f109-4ecd-9c8d-8b7349433014\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330730-lsbmf" Oct 07 13:30:00 crc kubenswrapper[4677]: I1007 13:30:00.481756 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330730-lsbmf" Oct 07 13:30:00 crc kubenswrapper[4677]: I1007 13:30:00.751401 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330730-lsbmf"] Oct 07 13:30:01 crc kubenswrapper[4677]: I1007 13:30:01.714805 4677 generic.go:334] "Generic (PLEG): container finished" podID="7e1f6dce-f109-4ecd-9c8d-8b7349433014" containerID="6d68ac89fdd5565f2b0c9a77f00ebf823be3ef056129c4d091bf4ee2a450f93d" exitCode=0 Oct 07 13:30:01 crc kubenswrapper[4677]: I1007 13:30:01.714888 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29330730-lsbmf" event={"ID":"7e1f6dce-f109-4ecd-9c8d-8b7349433014","Type":"ContainerDied","Data":"6d68ac89fdd5565f2b0c9a77f00ebf823be3ef056129c4d091bf4ee2a450f93d"} Oct 07 13:30:01 crc kubenswrapper[4677]: I1007 13:30:01.714914 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29330730-lsbmf" event={"ID":"7e1f6dce-f109-4ecd-9c8d-8b7349433014","Type":"ContainerStarted","Data":"fb459587de257846fdf412dda4eea1aeb592226d6e8f2d3f1f799675ea2ac01b"} Oct 07 13:30:03 crc kubenswrapper[4677]: I1007 13:30:03.099885 4677 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330730-lsbmf" Oct 07 13:30:03 crc kubenswrapper[4677]: I1007 13:30:03.257776 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7h69c\" (UniqueName: \"kubernetes.io/projected/7e1f6dce-f109-4ecd-9c8d-8b7349433014-kube-api-access-7h69c\") pod \"7e1f6dce-f109-4ecd-9c8d-8b7349433014\" (UID: \"7e1f6dce-f109-4ecd-9c8d-8b7349433014\") " Oct 07 13:30:03 crc kubenswrapper[4677]: I1007 13:30:03.257852 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7e1f6dce-f109-4ecd-9c8d-8b7349433014-secret-volume\") pod \"7e1f6dce-f109-4ecd-9c8d-8b7349433014\" (UID: \"7e1f6dce-f109-4ecd-9c8d-8b7349433014\") " Oct 07 13:30:03 crc kubenswrapper[4677]: I1007 13:30:03.257942 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7e1f6dce-f109-4ecd-9c8d-8b7349433014-config-volume\") pod \"7e1f6dce-f109-4ecd-9c8d-8b7349433014\" (UID: \"7e1f6dce-f109-4ecd-9c8d-8b7349433014\") " Oct 07 13:30:03 crc kubenswrapper[4677]: I1007 13:30:03.259001 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e1f6dce-f109-4ecd-9c8d-8b7349433014-config-volume" (OuterVolumeSpecName: "config-volume") pod "7e1f6dce-f109-4ecd-9c8d-8b7349433014" (UID: "7e1f6dce-f109-4ecd-9c8d-8b7349433014"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:30:03 crc kubenswrapper[4677]: I1007 13:30:03.264153 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e1f6dce-f109-4ecd-9c8d-8b7349433014-kube-api-access-7h69c" (OuterVolumeSpecName: "kube-api-access-7h69c") pod "7e1f6dce-f109-4ecd-9c8d-8b7349433014" (UID: "7e1f6dce-f109-4ecd-9c8d-8b7349433014"). InnerVolumeSpecName "kube-api-access-7h69c". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:30:03 crc kubenswrapper[4677]: I1007 13:30:03.267839 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e1f6dce-f109-4ecd-9c8d-8b7349433014-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "7e1f6dce-f109-4ecd-9c8d-8b7349433014" (UID: "7e1f6dce-f109-4ecd-9c8d-8b7349433014"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:30:03 crc kubenswrapper[4677]: I1007 13:30:03.360289 4677 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7h69c\" (UniqueName: \"kubernetes.io/projected/7e1f6dce-f109-4ecd-9c8d-8b7349433014-kube-api-access-7h69c\") on node \"crc\" DevicePath \"\"" Oct 07 13:30:03 crc kubenswrapper[4677]: I1007 13:30:03.360328 4677 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7e1f6dce-f109-4ecd-9c8d-8b7349433014-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 07 13:30:03 crc kubenswrapper[4677]: I1007 13:30:03.360340 4677 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7e1f6dce-f109-4ecd-9c8d-8b7349433014-config-volume\") on node \"crc\" DevicePath \"\"" Oct 07 13:30:03 crc kubenswrapper[4677]: I1007 13:30:03.736203 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29330730-lsbmf" event={"ID":"7e1f6dce-f109-4ecd-9c8d-8b7349433014","Type":"ContainerDied","Data":"fb459587de257846fdf412dda4eea1aeb592226d6e8f2d3f1f799675ea2ac01b"} Oct 07 13:30:03 crc kubenswrapper[4677]: I1007 13:30:03.736239 4677 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fb459587de257846fdf412dda4eea1aeb592226d6e8f2d3f1f799675ea2ac01b" Oct 07 13:30:03 crc kubenswrapper[4677]: I1007 13:30:03.736290 4677 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330730-lsbmf" Oct 07 13:30:10 crc kubenswrapper[4677]: I1007 13:30:10.386454 4677 scope.go:117] "RemoveContainer" containerID="2a0c4af1ac9fb3f088677e28d3185c071f724652ef171a1118d32854c56a1145" Oct 07 13:30:10 crc kubenswrapper[4677]: I1007 13:30:10.419074 4677 scope.go:117] "RemoveContainer" containerID="4cbee337be6cd9c277d978021e0d03671aaa66539756468a77d43a7692557f7d" Oct 07 13:30:10 crc kubenswrapper[4677]: I1007 13:30:10.467105 4677 scope.go:117] "RemoveContainer" containerID="0e85b2bd38e151eb42702e923d86e3e5a3f33f1820e387b717d1eda5bdacf085" Oct 07 13:30:10 crc kubenswrapper[4677]: I1007 13:30:10.490762 4677 scope.go:117] "RemoveContainer" containerID="8a9ea4fad0a2e53abface3155349591856e21236e9fea2c57369c18036a22bca" Oct 07 13:30:40 crc kubenswrapper[4677]: I1007 13:30:40.917269 4677 patch_prober.go:28] interesting pod/machine-config-daemon-r7cnz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 13:30:40 crc kubenswrapper[4677]: I1007 13:30:40.918172 4677 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-r7cnz" podUID="7879fa59-a7cb-4d29-ba3a-c91f43bfcba6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 13:30:52 crc kubenswrapper[4677]: I1007 13:30:52.431128 4677 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-kwl46"] Oct 07 13:30:52 crc kubenswrapper[4677]: E1007 13:30:52.432327 4677 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e1f6dce-f109-4ecd-9c8d-8b7349433014" containerName="collect-profiles" Oct 07 13:30:52 crc kubenswrapper[4677]: I1007 13:30:52.432355 4677 
state_mem.go:107] "Deleted CPUSet assignment" podUID="7e1f6dce-f109-4ecd-9c8d-8b7349433014" containerName="collect-profiles" Oct 07 13:30:52 crc kubenswrapper[4677]: I1007 13:30:52.432678 4677 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e1f6dce-f109-4ecd-9c8d-8b7349433014" containerName="collect-profiles" Oct 07 13:30:52 crc kubenswrapper[4677]: I1007 13:30:52.448106 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-kwl46" Oct 07 13:30:52 crc kubenswrapper[4677]: I1007 13:30:52.495349 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-kwl46"] Oct 07 13:30:52 crc kubenswrapper[4677]: I1007 13:30:52.564646 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cgcxs\" (UniqueName: \"kubernetes.io/projected/38ea31c1-c9de-4b72-9cbe-b25e7c848c77-kube-api-access-cgcxs\") pod \"community-operators-kwl46\" (UID: \"38ea31c1-c9de-4b72-9cbe-b25e7c848c77\") " pod="openshift-marketplace/community-operators-kwl46" Oct 07 13:30:52 crc kubenswrapper[4677]: I1007 13:30:52.564700 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/38ea31c1-c9de-4b72-9cbe-b25e7c848c77-utilities\") pod \"community-operators-kwl46\" (UID: \"38ea31c1-c9de-4b72-9cbe-b25e7c848c77\") " pod="openshift-marketplace/community-operators-kwl46" Oct 07 13:30:52 crc kubenswrapper[4677]: I1007 13:30:52.564732 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/38ea31c1-c9de-4b72-9cbe-b25e7c848c77-catalog-content\") pod \"community-operators-kwl46\" (UID: \"38ea31c1-c9de-4b72-9cbe-b25e7c848c77\") " pod="openshift-marketplace/community-operators-kwl46" Oct 07 13:30:52 crc kubenswrapper[4677]: I1007 13:30:52.665861 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cgcxs\" (UniqueName: \"kubernetes.io/projected/38ea31c1-c9de-4b72-9cbe-b25e7c848c77-kube-api-access-cgcxs\") pod \"community-operators-kwl46\" (UID: \"38ea31c1-c9de-4b72-9cbe-b25e7c848c77\") " pod="openshift-marketplace/community-operators-kwl46" Oct 07 13:30:52 crc kubenswrapper[4677]: I1007 13:30:52.665922 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/38ea31c1-c9de-4b72-9cbe-b25e7c848c77-utilities\") pod \"community-operators-kwl46\" (UID: \"38ea31c1-c9de-4b72-9cbe-b25e7c848c77\") " pod="openshift-marketplace/community-operators-kwl46" Oct 07 13:30:52 crc kubenswrapper[4677]: I1007 13:30:52.665955 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/38ea31c1-c9de-4b72-9cbe-b25e7c848c77-catalog-content\") pod \"community-operators-kwl46\" (UID: \"38ea31c1-c9de-4b72-9cbe-b25e7c848c77\") " pod="openshift-marketplace/community-operators-kwl46" Oct 07 13:30:52 crc kubenswrapper[4677]: I1007 13:30:52.666728 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/38ea31c1-c9de-4b72-9cbe-b25e7c848c77-utilities\") pod \"community-operators-kwl46\" (UID: \"38ea31c1-c9de-4b72-9cbe-b25e7c848c77\") " pod="openshift-marketplace/community-operators-kwl46" Oct 07 13:30:52 crc kubenswrapper[4677]: 
I1007 13:30:52.666777 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/38ea31c1-c9de-4b72-9cbe-b25e7c848c77-catalog-content\") pod \"community-operators-kwl46\" (UID: \"38ea31c1-c9de-4b72-9cbe-b25e7c848c77\") " pod="openshift-marketplace/community-operators-kwl46" Oct 07 13:30:52 crc kubenswrapper[4677]: I1007 13:30:52.690359 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cgcxs\" (UniqueName: \"kubernetes.io/projected/38ea31c1-c9de-4b72-9cbe-b25e7c848c77-kube-api-access-cgcxs\") pod \"community-operators-kwl46\" (UID: \"38ea31c1-c9de-4b72-9cbe-b25e7c848c77\") " pod="openshift-marketplace/community-operators-kwl46" Oct 07 13:30:52 crc kubenswrapper[4677]: I1007 13:30:52.795838 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-kwl46" Oct 07 13:30:53 crc kubenswrapper[4677]: I1007 13:30:53.035713 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-kwl46"] Oct 07 13:30:53 crc kubenswrapper[4677]: I1007 13:30:53.191752 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kwl46" event={"ID":"38ea31c1-c9de-4b72-9cbe-b25e7c848c77","Type":"ContainerStarted","Data":"760ba6dc2891f05ba7caf3e4509f31fe4ed8bdadd34a1571d1cf2f0af9e288ca"} Oct 07 13:30:54 crc kubenswrapper[4677]: I1007 13:30:54.203235 4677 generic.go:334] "Generic (PLEG): container finished" podID="38ea31c1-c9de-4b72-9cbe-b25e7c848c77" containerID="abdc043dcf6a1ffaa9a808c0e0490f119a05296b8abd1f5881dc8b85cbacae93" exitCode=0 Oct 07 13:30:54 crc kubenswrapper[4677]: I1007 13:30:54.203290 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kwl46" event={"ID":"38ea31c1-c9de-4b72-9cbe-b25e7c848c77","Type":"ContainerDied","Data":"abdc043dcf6a1ffaa9a808c0e0490f119a05296b8abd1f5881dc8b85cbacae93"} Oct 07 13:30:56 crc kubenswrapper[4677]: I1007 13:30:56.224746 4677 generic.go:334] "Generic (PLEG): container finished" podID="38ea31c1-c9de-4b72-9cbe-b25e7c848c77" containerID="ff9dcff9b26777091b6ba2a93b45967e8d0afdbf66f4cc95fa57967ace11567c" exitCode=0 Oct 07 13:30:56 crc kubenswrapper[4677]: I1007 13:30:56.224805 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kwl46" event={"ID":"38ea31c1-c9de-4b72-9cbe-b25e7c848c77","Type":"ContainerDied","Data":"ff9dcff9b26777091b6ba2a93b45967e8d0afdbf66f4cc95fa57967ace11567c"} Oct 07 13:30:57 crc kubenswrapper[4677]: I1007 13:30:57.252763 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kwl46" event={"ID":"38ea31c1-c9de-4b72-9cbe-b25e7c848c77","Type":"ContainerStarted","Data":"2363b6c335f95fd529a17ffaedb4f62a9eb74dde40bd512a2aae01de569d28bd"} Oct 07 13:30:57 crc kubenswrapper[4677]: I1007 13:30:57.282979 4677 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-kwl46" podStartSLOduration=2.486132929 podStartE2EDuration="5.28295265s" podCreationTimestamp="2025-10-07 13:30:52 +0000 UTC" firstStartedPulling="2025-10-07 13:30:54.205114893 +0000 UTC m=+1425.690824048" lastFinishedPulling="2025-10-07 13:30:57.001934644 +0000 UTC m=+1428.487643769" observedRunningTime="2025-10-07 13:30:57.27773436 +0000 UTC m=+1428.763443545" watchObservedRunningTime="2025-10-07 13:30:57.28295265 +0000 UTC m=+1428.768661785" Oct 
07 13:31:02 crc kubenswrapper[4677]: I1007 13:31:02.796377 4677 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-kwl46" Oct 07 13:31:02 crc kubenswrapper[4677]: I1007 13:31:02.798646 4677 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-kwl46" Oct 07 13:31:02 crc kubenswrapper[4677]: I1007 13:31:02.855047 4677 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-kwl46" Oct 07 13:31:03 crc kubenswrapper[4677]: I1007 13:31:03.369625 4677 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-kwl46" Oct 07 13:31:03 crc kubenswrapper[4677]: I1007 13:31:03.441710 4677 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-kwl46"] Oct 07 13:31:05 crc kubenswrapper[4677]: I1007 13:31:05.316385 4677 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-kwl46" podUID="38ea31c1-c9de-4b72-9cbe-b25e7c848c77" containerName="registry-server" containerID="cri-o://2363b6c335f95fd529a17ffaedb4f62a9eb74dde40bd512a2aae01de569d28bd" gracePeriod=2 Oct 07 13:31:06 crc kubenswrapper[4677]: I1007 13:31:06.330294 4677 generic.go:334] "Generic (PLEG): container finished" podID="38ea31c1-c9de-4b72-9cbe-b25e7c848c77" containerID="2363b6c335f95fd529a17ffaedb4f62a9eb74dde40bd512a2aae01de569d28bd" exitCode=0 Oct 07 13:31:06 crc kubenswrapper[4677]: I1007 13:31:06.330742 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kwl46" event={"ID":"38ea31c1-c9de-4b72-9cbe-b25e7c848c77","Type":"ContainerDied","Data":"2363b6c335f95fd529a17ffaedb4f62a9eb74dde40bd512a2aae01de569d28bd"} Oct 07 13:31:07 crc kubenswrapper[4677]: I1007 13:31:07.540009 4677 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-kwl46" Oct 07 13:31:07 crc kubenswrapper[4677]: I1007 13:31:07.717858 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cgcxs\" (UniqueName: \"kubernetes.io/projected/38ea31c1-c9de-4b72-9cbe-b25e7c848c77-kube-api-access-cgcxs\") pod \"38ea31c1-c9de-4b72-9cbe-b25e7c848c77\" (UID: \"38ea31c1-c9de-4b72-9cbe-b25e7c848c77\") " Oct 07 13:31:07 crc kubenswrapper[4677]: I1007 13:31:07.717939 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/38ea31c1-c9de-4b72-9cbe-b25e7c848c77-utilities\") pod \"38ea31c1-c9de-4b72-9cbe-b25e7c848c77\" (UID: \"38ea31c1-c9de-4b72-9cbe-b25e7c848c77\") " Oct 07 13:31:07 crc kubenswrapper[4677]: I1007 13:31:07.718172 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/38ea31c1-c9de-4b72-9cbe-b25e7c848c77-catalog-content\") pod \"38ea31c1-c9de-4b72-9cbe-b25e7c848c77\" (UID: \"38ea31c1-c9de-4b72-9cbe-b25e7c848c77\") " Oct 07 13:31:07 crc kubenswrapper[4677]: I1007 13:31:07.720583 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/38ea31c1-c9de-4b72-9cbe-b25e7c848c77-utilities" (OuterVolumeSpecName: "utilities") pod "38ea31c1-c9de-4b72-9cbe-b25e7c848c77" (UID: "38ea31c1-c9de-4b72-9cbe-b25e7c848c77"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:31:07 crc kubenswrapper[4677]: I1007 13:31:07.730710 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38ea31c1-c9de-4b72-9cbe-b25e7c848c77-kube-api-access-cgcxs" (OuterVolumeSpecName: "kube-api-access-cgcxs") pod "38ea31c1-c9de-4b72-9cbe-b25e7c848c77" (UID: "38ea31c1-c9de-4b72-9cbe-b25e7c848c77"). InnerVolumeSpecName "kube-api-access-cgcxs". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:31:07 crc kubenswrapper[4677]: I1007 13:31:07.801003 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/38ea31c1-c9de-4b72-9cbe-b25e7c848c77-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "38ea31c1-c9de-4b72-9cbe-b25e7c848c77" (UID: "38ea31c1-c9de-4b72-9cbe-b25e7c848c77"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:31:07 crc kubenswrapper[4677]: I1007 13:31:07.820861 4677 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/38ea31c1-c9de-4b72-9cbe-b25e7c848c77-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 13:31:07 crc kubenswrapper[4677]: I1007 13:31:07.820907 4677 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cgcxs\" (UniqueName: \"kubernetes.io/projected/38ea31c1-c9de-4b72-9cbe-b25e7c848c77-kube-api-access-cgcxs\") on node \"crc\" DevicePath \"\"" Oct 07 13:31:07 crc kubenswrapper[4677]: I1007 13:31:07.820933 4677 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/38ea31c1-c9de-4b72-9cbe-b25e7c848c77-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 13:31:08 crc kubenswrapper[4677]: I1007 13:31:08.360514 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kwl46" event={"ID":"38ea31c1-c9de-4b72-9cbe-b25e7c848c77","Type":"ContainerDied","Data":"760ba6dc2891f05ba7caf3e4509f31fe4ed8bdadd34a1571d1cf2f0af9e288ca"} Oct 07 13:31:08 crc kubenswrapper[4677]: I1007 13:31:08.360608 4677 scope.go:117] "RemoveContainer" containerID="2363b6c335f95fd529a17ffaedb4f62a9eb74dde40bd512a2aae01de569d28bd" Oct 07 13:31:08 crc kubenswrapper[4677]: I1007 13:31:08.360703 4677 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-kwl46" Oct 07 13:31:08 crc kubenswrapper[4677]: I1007 13:31:08.394966 4677 scope.go:117] "RemoveContainer" containerID="ff9dcff9b26777091b6ba2a93b45967e8d0afdbf66f4cc95fa57967ace11567c" Oct 07 13:31:08 crc kubenswrapper[4677]: I1007 13:31:08.406279 4677 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-kwl46"] Oct 07 13:31:08 crc kubenswrapper[4677]: I1007 13:31:08.412048 4677 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-kwl46"] Oct 07 13:31:08 crc kubenswrapper[4677]: I1007 13:31:08.439907 4677 scope.go:117] "RemoveContainer" containerID="abdc043dcf6a1ffaa9a808c0e0490f119a05296b8abd1f5881dc8b85cbacae93" Oct 07 13:31:09 crc kubenswrapper[4677]: I1007 13:31:09.318112 4677 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="38ea31c1-c9de-4b72-9cbe-b25e7c848c77" path="/var/lib/kubelet/pods/38ea31c1-c9de-4b72-9cbe-b25e7c848c77/volumes" Oct 07 13:31:10 crc kubenswrapper[4677]: I1007 13:31:10.575312 4677 scope.go:117] "RemoveContainer" containerID="81a9e406509d8a147436efcbc4e859895d1103232c9f2a2d0503ae75c417f9eb" Oct 07 13:31:10 crc kubenswrapper[4677]: I1007 13:31:10.594847 4677 scope.go:117] "RemoveContainer" containerID="65718cb46256c82b8f0772670966bf56a32bb58ee7f882ac94471e9c23a98984" Oct 07 13:31:10 crc kubenswrapper[4677]: I1007 13:31:10.615463 4677 scope.go:117] "RemoveContainer" containerID="3f929f62fdb7df361d82c388ebb48844dcac0c50fb2ad8b525a6f6e0fe940e49" Oct 07 13:31:10 crc kubenswrapper[4677]: I1007 13:31:10.917817 4677 patch_prober.go:28] interesting pod/machine-config-daemon-r7cnz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 13:31:10 crc kubenswrapper[4677]: I1007 13:31:10.917909 4677 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-r7cnz" podUID="7879fa59-a7cb-4d29-ba3a-c91f43bfcba6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 13:31:21 crc kubenswrapper[4677]: I1007 13:31:21.483763 4677 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-d7r4d"] Oct 07 13:31:21 crc kubenswrapper[4677]: E1007 13:31:21.484502 4677 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38ea31c1-c9de-4b72-9cbe-b25e7c848c77" containerName="registry-server" Oct 07 13:31:21 crc kubenswrapper[4677]: I1007 13:31:21.484517 4677 state_mem.go:107] "Deleted CPUSet assignment" podUID="38ea31c1-c9de-4b72-9cbe-b25e7c848c77" containerName="registry-server" Oct 07 13:31:21 crc kubenswrapper[4677]: E1007 13:31:21.484536 4677 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38ea31c1-c9de-4b72-9cbe-b25e7c848c77" containerName="extract-utilities" Oct 07 13:31:21 crc kubenswrapper[4677]: I1007 13:31:21.484543 4677 state_mem.go:107] "Deleted CPUSet assignment" podUID="38ea31c1-c9de-4b72-9cbe-b25e7c848c77" containerName="extract-utilities" Oct 07 13:31:21 crc kubenswrapper[4677]: E1007 13:31:21.484553 4677 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38ea31c1-c9de-4b72-9cbe-b25e7c848c77" containerName="extract-content" Oct 07 13:31:21 crc kubenswrapper[4677]: I1007 13:31:21.484560 4677 
state_mem.go:107] "Deleted CPUSet assignment" podUID="38ea31c1-c9de-4b72-9cbe-b25e7c848c77" containerName="extract-content" Oct 07 13:31:21 crc kubenswrapper[4677]: I1007 13:31:21.484715 4677 memory_manager.go:354] "RemoveStaleState removing state" podUID="38ea31c1-c9de-4b72-9cbe-b25e7c848c77" containerName="registry-server" Oct 07 13:31:21 crc kubenswrapper[4677]: I1007 13:31:21.485733 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-d7r4d" Oct 07 13:31:21 crc kubenswrapper[4677]: I1007 13:31:21.552113 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-d7r4d"] Oct 07 13:31:21 crc kubenswrapper[4677]: I1007 13:31:21.628640 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d4fdd\" (UniqueName: \"kubernetes.io/projected/7463a8a1-ce11-48dc-9896-f70e73a4ef6d-kube-api-access-d4fdd\") pod \"redhat-operators-d7r4d\" (UID: \"7463a8a1-ce11-48dc-9896-f70e73a4ef6d\") " pod="openshift-marketplace/redhat-operators-d7r4d" Oct 07 13:31:21 crc kubenswrapper[4677]: I1007 13:31:21.628702 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7463a8a1-ce11-48dc-9896-f70e73a4ef6d-catalog-content\") pod \"redhat-operators-d7r4d\" (UID: \"7463a8a1-ce11-48dc-9896-f70e73a4ef6d\") " pod="openshift-marketplace/redhat-operators-d7r4d" Oct 07 13:31:21 crc kubenswrapper[4677]: I1007 13:31:21.628761 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7463a8a1-ce11-48dc-9896-f70e73a4ef6d-utilities\") pod \"redhat-operators-d7r4d\" (UID: \"7463a8a1-ce11-48dc-9896-f70e73a4ef6d\") " pod="openshift-marketplace/redhat-operators-d7r4d" Oct 07 13:31:21 crc kubenswrapper[4677]: I1007 13:31:21.730453 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d4fdd\" (UniqueName: \"kubernetes.io/projected/7463a8a1-ce11-48dc-9896-f70e73a4ef6d-kube-api-access-d4fdd\") pod \"redhat-operators-d7r4d\" (UID: \"7463a8a1-ce11-48dc-9896-f70e73a4ef6d\") " pod="openshift-marketplace/redhat-operators-d7r4d" Oct 07 13:31:21 crc kubenswrapper[4677]: I1007 13:31:21.730500 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7463a8a1-ce11-48dc-9896-f70e73a4ef6d-catalog-content\") pod \"redhat-operators-d7r4d\" (UID: \"7463a8a1-ce11-48dc-9896-f70e73a4ef6d\") " pod="openshift-marketplace/redhat-operators-d7r4d" Oct 07 13:31:21 crc kubenswrapper[4677]: I1007 13:31:21.730556 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7463a8a1-ce11-48dc-9896-f70e73a4ef6d-utilities\") pod \"redhat-operators-d7r4d\" (UID: \"7463a8a1-ce11-48dc-9896-f70e73a4ef6d\") " pod="openshift-marketplace/redhat-operators-d7r4d" Oct 07 13:31:21 crc kubenswrapper[4677]: I1007 13:31:21.731101 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7463a8a1-ce11-48dc-9896-f70e73a4ef6d-utilities\") pod \"redhat-operators-d7r4d\" (UID: \"7463a8a1-ce11-48dc-9896-f70e73a4ef6d\") " pod="openshift-marketplace/redhat-operators-d7r4d" Oct 07 13:31:21 crc kubenswrapper[4677]: I1007 13:31:21.731101 4677 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7463a8a1-ce11-48dc-9896-f70e73a4ef6d-catalog-content\") pod \"redhat-operators-d7r4d\" (UID: \"7463a8a1-ce11-48dc-9896-f70e73a4ef6d\") " pod="openshift-marketplace/redhat-operators-d7r4d" Oct 07 13:31:21 crc kubenswrapper[4677]: I1007 13:31:21.770561 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d4fdd\" (UniqueName: \"kubernetes.io/projected/7463a8a1-ce11-48dc-9896-f70e73a4ef6d-kube-api-access-d4fdd\") pod \"redhat-operators-d7r4d\" (UID: \"7463a8a1-ce11-48dc-9896-f70e73a4ef6d\") " pod="openshift-marketplace/redhat-operators-d7r4d" Oct 07 13:31:21 crc kubenswrapper[4677]: I1007 13:31:21.831258 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-d7r4d" Oct 07 13:31:22 crc kubenswrapper[4677]: I1007 13:31:22.039126 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-d7r4d"] Oct 07 13:31:22 crc kubenswrapper[4677]: I1007 13:31:22.483886 4677 generic.go:334] "Generic (PLEG): container finished" podID="7463a8a1-ce11-48dc-9896-f70e73a4ef6d" containerID="4a6c11955c3c15a8c9e3a33bce147a46df6fcaadb0557e8179e0ff5999a9b429" exitCode=0 Oct 07 13:31:22 crc kubenswrapper[4677]: I1007 13:31:22.483935 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d7r4d" event={"ID":"7463a8a1-ce11-48dc-9896-f70e73a4ef6d","Type":"ContainerDied","Data":"4a6c11955c3c15a8c9e3a33bce147a46df6fcaadb0557e8179e0ff5999a9b429"} Oct 07 13:31:22 crc kubenswrapper[4677]: I1007 13:31:22.484183 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d7r4d" event={"ID":"7463a8a1-ce11-48dc-9896-f70e73a4ef6d","Type":"ContainerStarted","Data":"bc4e1b3389b58c804f81837f25c5464fed3e77b2b299e9d455613d47bad64ba8"} Oct 07 13:31:24 crc kubenswrapper[4677]: I1007 13:31:24.500246 4677 generic.go:334] "Generic (PLEG): container finished" podID="7463a8a1-ce11-48dc-9896-f70e73a4ef6d" containerID="d54c3bd649e0820cb09ba583716e2b97c8887b028182180bf66b6eea5b0e1e32" exitCode=0 Oct 07 13:31:24 crc kubenswrapper[4677]: I1007 13:31:24.500370 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d7r4d" event={"ID":"7463a8a1-ce11-48dc-9896-f70e73a4ef6d","Type":"ContainerDied","Data":"d54c3bd649e0820cb09ba583716e2b97c8887b028182180bf66b6eea5b0e1e32"} Oct 07 13:31:25 crc kubenswrapper[4677]: I1007 13:31:25.516047 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d7r4d" event={"ID":"7463a8a1-ce11-48dc-9896-f70e73a4ef6d","Type":"ContainerStarted","Data":"ce305c6213f1338fbf1108069456c65c321254eea0d5db5b926374ddd2cf2fff"} Oct 07 13:31:25 crc kubenswrapper[4677]: I1007 13:31:25.544974 4677 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-d7r4d" podStartSLOduration=2.027307964 podStartE2EDuration="4.544947772s" podCreationTimestamp="2025-10-07 13:31:21 +0000 UTC" firstStartedPulling="2025-10-07 13:31:22.485210887 +0000 UTC m=+1453.970920002" lastFinishedPulling="2025-10-07 13:31:25.002850695 +0000 UTC m=+1456.488559810" observedRunningTime="2025-10-07 13:31:25.543565882 +0000 UTC m=+1457.029275097" watchObservedRunningTime="2025-10-07 13:31:25.544947772 +0000 UTC m=+1457.030656937" Oct 07 13:31:31 crc kubenswrapper[4677]: I1007 13:31:31.832145 
4677 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-d7r4d" Oct 07 13:31:31 crc kubenswrapper[4677]: I1007 13:31:31.832845 4677 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-d7r4d" Oct 07 13:31:31 crc kubenswrapper[4677]: I1007 13:31:31.884126 4677 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-d7r4d" Oct 07 13:31:32 crc kubenswrapper[4677]: I1007 13:31:32.632908 4677 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-d7r4d" Oct 07 13:31:32 crc kubenswrapper[4677]: I1007 13:31:32.691966 4677 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-d7r4d"] Oct 07 13:31:34 crc kubenswrapper[4677]: I1007 13:31:34.591758 4677 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-d7r4d" podUID="7463a8a1-ce11-48dc-9896-f70e73a4ef6d" containerName="registry-server" containerID="cri-o://ce305c6213f1338fbf1108069456c65c321254eea0d5db5b926374ddd2cf2fff" gracePeriod=2 Oct 07 13:31:35 crc kubenswrapper[4677]: I1007 13:31:35.591589 4677 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-d7r4d" Oct 07 13:31:35 crc kubenswrapper[4677]: I1007 13:31:35.600315 4677 generic.go:334] "Generic (PLEG): container finished" podID="7463a8a1-ce11-48dc-9896-f70e73a4ef6d" containerID="ce305c6213f1338fbf1108069456c65c321254eea0d5db5b926374ddd2cf2fff" exitCode=0 Oct 07 13:31:35 crc kubenswrapper[4677]: I1007 13:31:35.600370 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d7r4d" event={"ID":"7463a8a1-ce11-48dc-9896-f70e73a4ef6d","Type":"ContainerDied","Data":"ce305c6213f1338fbf1108069456c65c321254eea0d5db5b926374ddd2cf2fff"} Oct 07 13:31:35 crc kubenswrapper[4677]: I1007 13:31:35.600383 4677 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-d7r4d" Oct 07 13:31:35 crc kubenswrapper[4677]: I1007 13:31:35.600467 4677 scope.go:117] "RemoveContainer" containerID="ce305c6213f1338fbf1108069456c65c321254eea0d5db5b926374ddd2cf2fff" Oct 07 13:31:35 crc kubenswrapper[4677]: I1007 13:31:35.600450 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d7r4d" event={"ID":"7463a8a1-ce11-48dc-9896-f70e73a4ef6d","Type":"ContainerDied","Data":"bc4e1b3389b58c804f81837f25c5464fed3e77b2b299e9d455613d47bad64ba8"} Oct 07 13:31:35 crc kubenswrapper[4677]: I1007 13:31:35.623641 4677 scope.go:117] "RemoveContainer" containerID="d54c3bd649e0820cb09ba583716e2b97c8887b028182180bf66b6eea5b0e1e32" Oct 07 13:31:35 crc kubenswrapper[4677]: I1007 13:31:35.641078 4677 scope.go:117] "RemoveContainer" containerID="4a6c11955c3c15a8c9e3a33bce147a46df6fcaadb0557e8179e0ff5999a9b429" Oct 07 13:31:35 crc kubenswrapper[4677]: I1007 13:31:35.662336 4677 scope.go:117] "RemoveContainer" containerID="ce305c6213f1338fbf1108069456c65c321254eea0d5db5b926374ddd2cf2fff" Oct 07 13:31:35 crc kubenswrapper[4677]: E1007 13:31:35.663959 4677 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ce305c6213f1338fbf1108069456c65c321254eea0d5db5b926374ddd2cf2fff\": container with ID starting with ce305c6213f1338fbf1108069456c65c321254eea0d5db5b926374ddd2cf2fff not found: ID does not exist" containerID="ce305c6213f1338fbf1108069456c65c321254eea0d5db5b926374ddd2cf2fff" Oct 07 13:31:35 crc kubenswrapper[4677]: I1007 13:31:35.664043 4677 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce305c6213f1338fbf1108069456c65c321254eea0d5db5b926374ddd2cf2fff"} err="failed to get container status \"ce305c6213f1338fbf1108069456c65c321254eea0d5db5b926374ddd2cf2fff\": rpc error: code = NotFound desc = could not find container \"ce305c6213f1338fbf1108069456c65c321254eea0d5db5b926374ddd2cf2fff\": container with ID starting with ce305c6213f1338fbf1108069456c65c321254eea0d5db5b926374ddd2cf2fff not found: ID does not exist" Oct 07 13:31:35 crc kubenswrapper[4677]: I1007 13:31:35.664094 4677 scope.go:117] "RemoveContainer" containerID="d54c3bd649e0820cb09ba583716e2b97c8887b028182180bf66b6eea5b0e1e32" Oct 07 13:31:35 crc kubenswrapper[4677]: E1007 13:31:35.664800 4677 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d54c3bd649e0820cb09ba583716e2b97c8887b028182180bf66b6eea5b0e1e32\": container with ID starting with d54c3bd649e0820cb09ba583716e2b97c8887b028182180bf66b6eea5b0e1e32 not found: ID does not exist" containerID="d54c3bd649e0820cb09ba583716e2b97c8887b028182180bf66b6eea5b0e1e32" Oct 07 13:31:35 crc kubenswrapper[4677]: I1007 13:31:35.665150 4677 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d54c3bd649e0820cb09ba583716e2b97c8887b028182180bf66b6eea5b0e1e32"} err="failed to get container status \"d54c3bd649e0820cb09ba583716e2b97c8887b028182180bf66b6eea5b0e1e32\": rpc error: code = NotFound desc = could not find container \"d54c3bd649e0820cb09ba583716e2b97c8887b028182180bf66b6eea5b0e1e32\": container with ID starting with d54c3bd649e0820cb09ba583716e2b97c8887b028182180bf66b6eea5b0e1e32 not found: ID does not exist" Oct 07 13:31:35 crc kubenswrapper[4677]: I1007 13:31:35.665180 4677 scope.go:117] "RemoveContainer" 
containerID="4a6c11955c3c15a8c9e3a33bce147a46df6fcaadb0557e8179e0ff5999a9b429" Oct 07 13:31:35 crc kubenswrapper[4677]: E1007 13:31:35.666294 4677 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4a6c11955c3c15a8c9e3a33bce147a46df6fcaadb0557e8179e0ff5999a9b429\": container with ID starting with 4a6c11955c3c15a8c9e3a33bce147a46df6fcaadb0557e8179e0ff5999a9b429 not found: ID does not exist" containerID="4a6c11955c3c15a8c9e3a33bce147a46df6fcaadb0557e8179e0ff5999a9b429" Oct 07 13:31:35 crc kubenswrapper[4677]: I1007 13:31:35.666333 4677 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a6c11955c3c15a8c9e3a33bce147a46df6fcaadb0557e8179e0ff5999a9b429"} err="failed to get container status \"4a6c11955c3c15a8c9e3a33bce147a46df6fcaadb0557e8179e0ff5999a9b429\": rpc error: code = NotFound desc = could not find container \"4a6c11955c3c15a8c9e3a33bce147a46df6fcaadb0557e8179e0ff5999a9b429\": container with ID starting with 4a6c11955c3c15a8c9e3a33bce147a46df6fcaadb0557e8179e0ff5999a9b429 not found: ID does not exist" Oct 07 13:31:35 crc kubenswrapper[4677]: I1007 13:31:35.767003 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7463a8a1-ce11-48dc-9896-f70e73a4ef6d-catalog-content\") pod \"7463a8a1-ce11-48dc-9896-f70e73a4ef6d\" (UID: \"7463a8a1-ce11-48dc-9896-f70e73a4ef6d\") " Oct 07 13:31:35 crc kubenswrapper[4677]: I1007 13:31:35.767170 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7463a8a1-ce11-48dc-9896-f70e73a4ef6d-utilities\") pod \"7463a8a1-ce11-48dc-9896-f70e73a4ef6d\" (UID: \"7463a8a1-ce11-48dc-9896-f70e73a4ef6d\") " Oct 07 13:31:35 crc kubenswrapper[4677]: I1007 13:31:35.767306 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4fdd\" (UniqueName: \"kubernetes.io/projected/7463a8a1-ce11-48dc-9896-f70e73a4ef6d-kube-api-access-d4fdd\") pod \"7463a8a1-ce11-48dc-9896-f70e73a4ef6d\" (UID: \"7463a8a1-ce11-48dc-9896-f70e73a4ef6d\") " Oct 07 13:31:35 crc kubenswrapper[4677]: I1007 13:31:35.768270 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7463a8a1-ce11-48dc-9896-f70e73a4ef6d-utilities" (OuterVolumeSpecName: "utilities") pod "7463a8a1-ce11-48dc-9896-f70e73a4ef6d" (UID: "7463a8a1-ce11-48dc-9896-f70e73a4ef6d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:31:35 crc kubenswrapper[4677]: I1007 13:31:35.776726 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7463a8a1-ce11-48dc-9896-f70e73a4ef6d-kube-api-access-d4fdd" (OuterVolumeSpecName: "kube-api-access-d4fdd") pod "7463a8a1-ce11-48dc-9896-f70e73a4ef6d" (UID: "7463a8a1-ce11-48dc-9896-f70e73a4ef6d"). InnerVolumeSpecName "kube-api-access-d4fdd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:31:35 crc kubenswrapper[4677]: I1007 13:31:35.861380 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7463a8a1-ce11-48dc-9896-f70e73a4ef6d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7463a8a1-ce11-48dc-9896-f70e73a4ef6d" (UID: "7463a8a1-ce11-48dc-9896-f70e73a4ef6d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:31:35 crc kubenswrapper[4677]: I1007 13:31:35.868844 4677 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7463a8a1-ce11-48dc-9896-f70e73a4ef6d-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 13:31:35 crc kubenswrapper[4677]: I1007 13:31:35.868888 4677 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7463a8a1-ce11-48dc-9896-f70e73a4ef6d-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 13:31:35 crc kubenswrapper[4677]: I1007 13:31:35.868906 4677 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4fdd\" (UniqueName: \"kubernetes.io/projected/7463a8a1-ce11-48dc-9896-f70e73a4ef6d-kube-api-access-d4fdd\") on node \"crc\" DevicePath \"\"" Oct 07 13:31:35 crc kubenswrapper[4677]: I1007 13:31:35.933447 4677 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-d7r4d"] Oct 07 13:31:35 crc kubenswrapper[4677]: I1007 13:31:35.939091 4677 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-d7r4d"] Oct 07 13:31:37 crc kubenswrapper[4677]: I1007 13:31:37.318459 4677 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7463a8a1-ce11-48dc-9896-f70e73a4ef6d" path="/var/lib/kubelet/pods/7463a8a1-ce11-48dc-9896-f70e73a4ef6d/volumes" Oct 07 13:31:40 crc kubenswrapper[4677]: I1007 13:31:40.917611 4677 patch_prober.go:28] interesting pod/machine-config-daemon-r7cnz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 13:31:40 crc kubenswrapper[4677]: I1007 13:31:40.918376 4677 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-r7cnz" podUID="7879fa59-a7cb-4d29-ba3a-c91f43bfcba6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 13:31:40 crc kubenswrapper[4677]: I1007 13:31:40.918571 4677 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-r7cnz" Oct 07 13:31:40 crc kubenswrapper[4677]: I1007 13:31:40.920470 4677 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ae51911f66a891e48f90a715757af601cb9a83227dd098a756b0a7910507b4bc"} pod="openshift-machine-config-operator/machine-config-daemon-r7cnz" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 07 13:31:40 crc kubenswrapper[4677]: I1007 13:31:40.920594 4677 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-r7cnz" podUID="7879fa59-a7cb-4d29-ba3a-c91f43bfcba6" containerName="machine-config-daemon" containerID="cri-o://ae51911f66a891e48f90a715757af601cb9a83227dd098a756b0a7910507b4bc" gracePeriod=600 Oct 07 13:31:41 crc kubenswrapper[4677]: I1007 13:31:41.664891 4677 generic.go:334] "Generic (PLEG): container finished" podID="7879fa59-a7cb-4d29-ba3a-c91f43bfcba6" containerID="ae51911f66a891e48f90a715757af601cb9a83227dd098a756b0a7910507b4bc" exitCode=0 Oct 07 13:31:41 crc kubenswrapper[4677]: I1007 
13:31:41.664960 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-r7cnz" event={"ID":"7879fa59-a7cb-4d29-ba3a-c91f43bfcba6","Type":"ContainerDied","Data":"ae51911f66a891e48f90a715757af601cb9a83227dd098a756b0a7910507b4bc"} Oct 07 13:31:41 crc kubenswrapper[4677]: I1007 13:31:41.665507 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-r7cnz" event={"ID":"7879fa59-a7cb-4d29-ba3a-c91f43bfcba6","Type":"ContainerStarted","Data":"9c395efdf36cc3142121da31bd43ed02b4d941c9bf27ad091d7892140334598b"} Oct 07 13:31:41 crc kubenswrapper[4677]: I1007 13:31:41.665582 4677 scope.go:117] "RemoveContainer" containerID="31488fc6d756c96c0843de0c3c1b89a15439285821997035e8781aaf44c08c84" Oct 07 13:31:44 crc kubenswrapper[4677]: I1007 13:31:44.957930 4677 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-5pndx"] Oct 07 13:31:44 crc kubenswrapper[4677]: E1007 13:31:44.958717 4677 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7463a8a1-ce11-48dc-9896-f70e73a4ef6d" containerName="registry-server" Oct 07 13:31:44 crc kubenswrapper[4677]: I1007 13:31:44.958732 4677 state_mem.go:107] "Deleted CPUSet assignment" podUID="7463a8a1-ce11-48dc-9896-f70e73a4ef6d" containerName="registry-server" Oct 07 13:31:44 crc kubenswrapper[4677]: E1007 13:31:44.958749 4677 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7463a8a1-ce11-48dc-9896-f70e73a4ef6d" containerName="extract-utilities" Oct 07 13:31:44 crc kubenswrapper[4677]: I1007 13:31:44.958756 4677 state_mem.go:107] "Deleted CPUSet assignment" podUID="7463a8a1-ce11-48dc-9896-f70e73a4ef6d" containerName="extract-utilities" Oct 07 13:31:44 crc kubenswrapper[4677]: E1007 13:31:44.958785 4677 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7463a8a1-ce11-48dc-9896-f70e73a4ef6d" containerName="extract-content" Oct 07 13:31:44 crc kubenswrapper[4677]: I1007 13:31:44.958792 4677 state_mem.go:107] "Deleted CPUSet assignment" podUID="7463a8a1-ce11-48dc-9896-f70e73a4ef6d" containerName="extract-content" Oct 07 13:31:44 crc kubenswrapper[4677]: I1007 13:31:44.958918 4677 memory_manager.go:354] "RemoveStaleState removing state" podUID="7463a8a1-ce11-48dc-9896-f70e73a4ef6d" containerName="registry-server" Oct 07 13:31:44 crc kubenswrapper[4677]: I1007 13:31:44.959982 4677 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-5pndx" Oct 07 13:31:44 crc kubenswrapper[4677]: I1007 13:31:44.972926 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5pndx"] Oct 07 13:31:45 crc kubenswrapper[4677]: I1007 13:31:45.104863 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7e087cb3-1474-4e09-942c-69987f0fcc6a-catalog-content\") pod \"certified-operators-5pndx\" (UID: \"7e087cb3-1474-4e09-942c-69987f0fcc6a\") " pod="openshift-marketplace/certified-operators-5pndx" Oct 07 13:31:45 crc kubenswrapper[4677]: I1007 13:31:45.105081 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7e087cb3-1474-4e09-942c-69987f0fcc6a-utilities\") pod \"certified-operators-5pndx\" (UID: \"7e087cb3-1474-4e09-942c-69987f0fcc6a\") " pod="openshift-marketplace/certified-operators-5pndx" Oct 07 13:31:45 crc kubenswrapper[4677]: I1007 13:31:45.105155 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jpcxw\" (UniqueName: \"kubernetes.io/projected/7e087cb3-1474-4e09-942c-69987f0fcc6a-kube-api-access-jpcxw\") pod \"certified-operators-5pndx\" (UID: \"7e087cb3-1474-4e09-942c-69987f0fcc6a\") " pod="openshift-marketplace/certified-operators-5pndx" Oct 07 13:31:45 crc kubenswrapper[4677]: I1007 13:31:45.206687 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7e087cb3-1474-4e09-942c-69987f0fcc6a-utilities\") pod \"certified-operators-5pndx\" (UID: \"7e087cb3-1474-4e09-942c-69987f0fcc6a\") " pod="openshift-marketplace/certified-operators-5pndx" Oct 07 13:31:45 crc kubenswrapper[4677]: I1007 13:31:45.206752 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jpcxw\" (UniqueName: \"kubernetes.io/projected/7e087cb3-1474-4e09-942c-69987f0fcc6a-kube-api-access-jpcxw\") pod \"certified-operators-5pndx\" (UID: \"7e087cb3-1474-4e09-942c-69987f0fcc6a\") " pod="openshift-marketplace/certified-operators-5pndx" Oct 07 13:31:45 crc kubenswrapper[4677]: I1007 13:31:45.206857 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7e087cb3-1474-4e09-942c-69987f0fcc6a-catalog-content\") pod \"certified-operators-5pndx\" (UID: \"7e087cb3-1474-4e09-942c-69987f0fcc6a\") " pod="openshift-marketplace/certified-operators-5pndx" Oct 07 13:31:45 crc kubenswrapper[4677]: I1007 13:31:45.207499 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7e087cb3-1474-4e09-942c-69987f0fcc6a-catalog-content\") pod \"certified-operators-5pndx\" (UID: \"7e087cb3-1474-4e09-942c-69987f0fcc6a\") " pod="openshift-marketplace/certified-operators-5pndx" Oct 07 13:31:45 crc kubenswrapper[4677]: I1007 13:31:45.207590 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7e087cb3-1474-4e09-942c-69987f0fcc6a-utilities\") pod \"certified-operators-5pndx\" (UID: \"7e087cb3-1474-4e09-942c-69987f0fcc6a\") " pod="openshift-marketplace/certified-operators-5pndx" Oct 07 13:31:45 crc kubenswrapper[4677]: I1007 13:31:45.228927 4677 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-jpcxw\" (UniqueName: \"kubernetes.io/projected/7e087cb3-1474-4e09-942c-69987f0fcc6a-kube-api-access-jpcxw\") pod \"certified-operators-5pndx\" (UID: \"7e087cb3-1474-4e09-942c-69987f0fcc6a\") " pod="openshift-marketplace/certified-operators-5pndx" Oct 07 13:31:45 crc kubenswrapper[4677]: I1007 13:31:45.302808 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5pndx" Oct 07 13:31:45 crc kubenswrapper[4677]: I1007 13:31:45.745297 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5pndx"] Oct 07 13:31:45 crc kubenswrapper[4677]: W1007 13:31:45.759845 4677 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7e087cb3_1474_4e09_942c_69987f0fcc6a.slice/crio-e89f90f0b778be0ef1e28249d615990147f47514fa67e020838235c3220e04d3 WatchSource:0}: Error finding container e89f90f0b778be0ef1e28249d615990147f47514fa67e020838235c3220e04d3: Status 404 returned error can't find the container with id e89f90f0b778be0ef1e28249d615990147f47514fa67e020838235c3220e04d3 Oct 07 13:31:46 crc kubenswrapper[4677]: I1007 13:31:46.705206 4677 generic.go:334] "Generic (PLEG): container finished" podID="7e087cb3-1474-4e09-942c-69987f0fcc6a" containerID="65f8c69ccb3c3150fcb3d901ec0a962b8a6350c97a8817a370feab727a39ddf5" exitCode=0 Oct 07 13:31:46 crc kubenswrapper[4677]: I1007 13:31:46.705306 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5pndx" event={"ID":"7e087cb3-1474-4e09-942c-69987f0fcc6a","Type":"ContainerDied","Data":"65f8c69ccb3c3150fcb3d901ec0a962b8a6350c97a8817a370feab727a39ddf5"} Oct 07 13:31:46 crc kubenswrapper[4677]: I1007 13:31:46.705493 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5pndx" event={"ID":"7e087cb3-1474-4e09-942c-69987f0fcc6a","Type":"ContainerStarted","Data":"e89f90f0b778be0ef1e28249d615990147f47514fa67e020838235c3220e04d3"} Oct 07 13:31:48 crc kubenswrapper[4677]: I1007 13:31:48.718780 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5pndx" event={"ID":"7e087cb3-1474-4e09-942c-69987f0fcc6a","Type":"ContainerStarted","Data":"8bc9d5b8fef79f2142767269d61a471c7aedb81b9c8b3d8ca6f525d0bd655d8f"} Oct 07 13:31:49 crc kubenswrapper[4677]: I1007 13:31:49.731595 4677 generic.go:334] "Generic (PLEG): container finished" podID="7e087cb3-1474-4e09-942c-69987f0fcc6a" containerID="8bc9d5b8fef79f2142767269d61a471c7aedb81b9c8b3d8ca6f525d0bd655d8f" exitCode=0 Oct 07 13:31:49 crc kubenswrapper[4677]: I1007 13:31:49.731675 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5pndx" event={"ID":"7e087cb3-1474-4e09-942c-69987f0fcc6a","Type":"ContainerDied","Data":"8bc9d5b8fef79f2142767269d61a471c7aedb81b9c8b3d8ca6f525d0bd655d8f"} Oct 07 13:31:51 crc kubenswrapper[4677]: I1007 13:31:51.751344 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5pndx" event={"ID":"7e087cb3-1474-4e09-942c-69987f0fcc6a","Type":"ContainerStarted","Data":"c90336b23465b60d5cd3a0f5c3af021e1b525fdc8f9aeda223ae3e5931f4e891"} Oct 07 13:31:51 crc kubenswrapper[4677]: I1007 13:31:51.775264 4677 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-5pndx" 
podStartSLOduration=3.8931494 podStartE2EDuration="7.775240065s" podCreationTimestamp="2025-10-07 13:31:44 +0000 UTC" firstStartedPulling="2025-10-07 13:31:46.706702061 +0000 UTC m=+1478.192411176" lastFinishedPulling="2025-10-07 13:31:50.588792726 +0000 UTC m=+1482.074501841" observedRunningTime="2025-10-07 13:31:51.774337909 +0000 UTC m=+1483.260047024" watchObservedRunningTime="2025-10-07 13:31:51.775240065 +0000 UTC m=+1483.260949180" Oct 07 13:31:55 crc kubenswrapper[4677]: I1007 13:31:55.312417 4677 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-5pndx" Oct 07 13:31:55 crc kubenswrapper[4677]: I1007 13:31:55.312898 4677 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-5pndx" Oct 07 13:31:55 crc kubenswrapper[4677]: I1007 13:31:55.355394 4677 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-5pndx" Oct 07 13:31:55 crc kubenswrapper[4677]: I1007 13:31:55.859742 4677 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-5pndx" Oct 07 13:31:55 crc kubenswrapper[4677]: I1007 13:31:55.912729 4677 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-5pndx"] Oct 07 13:31:57 crc kubenswrapper[4677]: I1007 13:31:57.816213 4677 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-5pndx" podUID="7e087cb3-1474-4e09-942c-69987f0fcc6a" containerName="registry-server" containerID="cri-o://c90336b23465b60d5cd3a0f5c3af021e1b525fdc8f9aeda223ae3e5931f4e891" gracePeriod=2 Oct 07 13:31:58 crc kubenswrapper[4677]: I1007 13:31:58.844913 4677 generic.go:334] "Generic (PLEG): container finished" podID="7e087cb3-1474-4e09-942c-69987f0fcc6a" containerID="c90336b23465b60d5cd3a0f5c3af021e1b525fdc8f9aeda223ae3e5931f4e891" exitCode=0 Oct 07 13:31:58 crc kubenswrapper[4677]: I1007 13:31:58.845124 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5pndx" event={"ID":"7e087cb3-1474-4e09-942c-69987f0fcc6a","Type":"ContainerDied","Data":"c90336b23465b60d5cd3a0f5c3af021e1b525fdc8f9aeda223ae3e5931f4e891"} Oct 07 13:31:58 crc kubenswrapper[4677]: I1007 13:31:58.937252 4677 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-5pndx" Oct 07 13:31:59 crc kubenswrapper[4677]: I1007 13:31:59.045486 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7e087cb3-1474-4e09-942c-69987f0fcc6a-catalog-content\") pod \"7e087cb3-1474-4e09-942c-69987f0fcc6a\" (UID: \"7e087cb3-1474-4e09-942c-69987f0fcc6a\") " Oct 07 13:31:59 crc kubenswrapper[4677]: I1007 13:31:59.045533 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7e087cb3-1474-4e09-942c-69987f0fcc6a-utilities\") pod \"7e087cb3-1474-4e09-942c-69987f0fcc6a\" (UID: \"7e087cb3-1474-4e09-942c-69987f0fcc6a\") " Oct 07 13:31:59 crc kubenswrapper[4677]: I1007 13:31:59.045603 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jpcxw\" (UniqueName: \"kubernetes.io/projected/7e087cb3-1474-4e09-942c-69987f0fcc6a-kube-api-access-jpcxw\") pod \"7e087cb3-1474-4e09-942c-69987f0fcc6a\" (UID: \"7e087cb3-1474-4e09-942c-69987f0fcc6a\") " Oct 07 13:31:59 crc kubenswrapper[4677]: I1007 13:31:59.046888 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7e087cb3-1474-4e09-942c-69987f0fcc6a-utilities" (OuterVolumeSpecName: "utilities") pod "7e087cb3-1474-4e09-942c-69987f0fcc6a" (UID: "7e087cb3-1474-4e09-942c-69987f0fcc6a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:31:59 crc kubenswrapper[4677]: I1007 13:31:59.051925 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e087cb3-1474-4e09-942c-69987f0fcc6a-kube-api-access-jpcxw" (OuterVolumeSpecName: "kube-api-access-jpcxw") pod "7e087cb3-1474-4e09-942c-69987f0fcc6a" (UID: "7e087cb3-1474-4e09-942c-69987f0fcc6a"). InnerVolumeSpecName "kube-api-access-jpcxw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:31:59 crc kubenswrapper[4677]: I1007 13:31:59.110292 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7e087cb3-1474-4e09-942c-69987f0fcc6a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7e087cb3-1474-4e09-942c-69987f0fcc6a" (UID: "7e087cb3-1474-4e09-942c-69987f0fcc6a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:31:59 crc kubenswrapper[4677]: I1007 13:31:59.146847 4677 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jpcxw\" (UniqueName: \"kubernetes.io/projected/7e087cb3-1474-4e09-942c-69987f0fcc6a-kube-api-access-jpcxw\") on node \"crc\" DevicePath \"\"" Oct 07 13:31:59 crc kubenswrapper[4677]: I1007 13:31:59.146929 4677 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7e087cb3-1474-4e09-942c-69987f0fcc6a-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 13:31:59 crc kubenswrapper[4677]: I1007 13:31:59.146943 4677 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7e087cb3-1474-4e09-942c-69987f0fcc6a-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 13:31:59 crc kubenswrapper[4677]: I1007 13:31:59.852264 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5pndx" event={"ID":"7e087cb3-1474-4e09-942c-69987f0fcc6a","Type":"ContainerDied","Data":"e89f90f0b778be0ef1e28249d615990147f47514fa67e020838235c3220e04d3"} Oct 07 13:31:59 crc kubenswrapper[4677]: I1007 13:31:59.852636 4677 scope.go:117] "RemoveContainer" containerID="c90336b23465b60d5cd3a0f5c3af021e1b525fdc8f9aeda223ae3e5931f4e891" Oct 07 13:31:59 crc kubenswrapper[4677]: I1007 13:31:59.852322 4677 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5pndx" Oct 07 13:31:59 crc kubenswrapper[4677]: I1007 13:31:59.880129 4677 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-5pndx"] Oct 07 13:31:59 crc kubenswrapper[4677]: I1007 13:31:59.886025 4677 scope.go:117] "RemoveContainer" containerID="8bc9d5b8fef79f2142767269d61a471c7aedb81b9c8b3d8ca6f525d0bd655d8f" Oct 07 13:31:59 crc kubenswrapper[4677]: I1007 13:31:59.886229 4677 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-5pndx"] Oct 07 13:31:59 crc kubenswrapper[4677]: I1007 13:31:59.905578 4677 scope.go:117] "RemoveContainer" containerID="65f8c69ccb3c3150fcb3d901ec0a962b8a6350c97a8817a370feab727a39ddf5" Oct 07 13:32:01 crc kubenswrapper[4677]: I1007 13:32:01.311823 4677 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7e087cb3-1474-4e09-942c-69987f0fcc6a" path="/var/lib/kubelet/pods/7e087cb3-1474-4e09-942c-69987f0fcc6a/volumes" Oct 07 13:32:10 crc kubenswrapper[4677]: I1007 13:32:10.696232 4677 scope.go:117] "RemoveContainer" containerID="4e147f80d89000dd8d28c33f3913c5594ffcd6e05509c6b6bfc82176abcc9c3a" Oct 07 13:32:10 crc kubenswrapper[4677]: I1007 13:32:10.715932 4677 scope.go:117] "RemoveContainer" containerID="42be7a25f38277e2f0f91260a907388c468e63e3d2f057825c401b0019340f9a" Oct 07 13:32:10 crc kubenswrapper[4677]: I1007 13:32:10.739052 4677 scope.go:117] "RemoveContainer" containerID="262af98a53197751da982603299ca6f3eec94b3f52ea708f5680dfb2dc50bf07" Oct 07 13:32:10 crc kubenswrapper[4677]: I1007 13:32:10.759751 4677 scope.go:117] "RemoveContainer" containerID="93f40180a44d6d1d31ea20f392e51cd8cec89e02b0e0d6a08a0e11af7f1f30e8" Oct 07 13:32:10 crc kubenswrapper[4677]: I1007 13:32:10.812398 4677 scope.go:117] "RemoveContainer" containerID="4d4cb2fd8469c6d6185c8a8996a9e39decdb470b52abb9be938bf38be38d6c6a" Oct 07 13:33:10 crc kubenswrapper[4677]: I1007 13:33:10.912186 4677 scope.go:117] "RemoveContainer" 
containerID="cf6d3dd7ac8321d14de7d9be095d826eee31026829d19d639d8e4ff79def663e" Oct 07 13:33:10 crc kubenswrapper[4677]: I1007 13:33:10.966138 4677 scope.go:117] "RemoveContainer" containerID="711d70acf68909e2e4ff8981d2aa59365aefab24e76c2d9b01eee732ffc53e78" Oct 07 13:34:10 crc kubenswrapper[4677]: I1007 13:34:10.917401 4677 patch_prober.go:28] interesting pod/machine-config-daemon-r7cnz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 13:34:10 crc kubenswrapper[4677]: I1007 13:34:10.918050 4677 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-r7cnz" podUID="7879fa59-a7cb-4d29-ba3a-c91f43bfcba6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 13:34:11 crc kubenswrapper[4677]: I1007 13:34:11.072636 4677 scope.go:117] "RemoveContainer" containerID="381198013d79ab3c7228dac947aa677060b49de0637755b01d97c879f2afc651" Oct 07 13:34:14 crc kubenswrapper[4677]: I1007 13:34:14.828759 4677 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-bw788"] Oct 07 13:34:14 crc kubenswrapper[4677]: E1007 13:34:14.829775 4677 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e087cb3-1474-4e09-942c-69987f0fcc6a" containerName="registry-server" Oct 07 13:34:14 crc kubenswrapper[4677]: I1007 13:34:14.829800 4677 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e087cb3-1474-4e09-942c-69987f0fcc6a" containerName="registry-server" Oct 07 13:34:14 crc kubenswrapper[4677]: E1007 13:34:14.829838 4677 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e087cb3-1474-4e09-942c-69987f0fcc6a" containerName="extract-utilities" Oct 07 13:34:14 crc kubenswrapper[4677]: I1007 13:34:14.829851 4677 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e087cb3-1474-4e09-942c-69987f0fcc6a" containerName="extract-utilities" Oct 07 13:34:14 crc kubenswrapper[4677]: E1007 13:34:14.829873 4677 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e087cb3-1474-4e09-942c-69987f0fcc6a" containerName="extract-content" Oct 07 13:34:14 crc kubenswrapper[4677]: I1007 13:34:14.829887 4677 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e087cb3-1474-4e09-942c-69987f0fcc6a" containerName="extract-content" Oct 07 13:34:14 crc kubenswrapper[4677]: I1007 13:34:14.830129 4677 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e087cb3-1474-4e09-942c-69987f0fcc6a" containerName="registry-server" Oct 07 13:34:14 crc kubenswrapper[4677]: I1007 13:34:14.831826 4677 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bw788" Oct 07 13:34:14 crc kubenswrapper[4677]: I1007 13:34:14.843894 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bw788"] Oct 07 13:34:14 crc kubenswrapper[4677]: I1007 13:34:14.864027 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5bce5223-6054-425b-8e83-888bb6794776-utilities\") pod \"redhat-marketplace-bw788\" (UID: \"5bce5223-6054-425b-8e83-888bb6794776\") " pod="openshift-marketplace/redhat-marketplace-bw788" Oct 07 13:34:14 crc kubenswrapper[4677]: I1007 13:34:14.864091 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5bce5223-6054-425b-8e83-888bb6794776-catalog-content\") pod \"redhat-marketplace-bw788\" (UID: \"5bce5223-6054-425b-8e83-888bb6794776\") " pod="openshift-marketplace/redhat-marketplace-bw788" Oct 07 13:34:14 crc kubenswrapper[4677]: I1007 13:34:14.864124 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rqf7m\" (UniqueName: \"kubernetes.io/projected/5bce5223-6054-425b-8e83-888bb6794776-kube-api-access-rqf7m\") pod \"redhat-marketplace-bw788\" (UID: \"5bce5223-6054-425b-8e83-888bb6794776\") " pod="openshift-marketplace/redhat-marketplace-bw788" Oct 07 13:34:14 crc kubenswrapper[4677]: I1007 13:34:14.965198 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5bce5223-6054-425b-8e83-888bb6794776-utilities\") pod \"redhat-marketplace-bw788\" (UID: \"5bce5223-6054-425b-8e83-888bb6794776\") " pod="openshift-marketplace/redhat-marketplace-bw788" Oct 07 13:34:14 crc kubenswrapper[4677]: I1007 13:34:14.965271 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5bce5223-6054-425b-8e83-888bb6794776-catalog-content\") pod \"redhat-marketplace-bw788\" (UID: \"5bce5223-6054-425b-8e83-888bb6794776\") " pod="openshift-marketplace/redhat-marketplace-bw788" Oct 07 13:34:14 crc kubenswrapper[4677]: I1007 13:34:14.965305 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rqf7m\" (UniqueName: \"kubernetes.io/projected/5bce5223-6054-425b-8e83-888bb6794776-kube-api-access-rqf7m\") pod \"redhat-marketplace-bw788\" (UID: \"5bce5223-6054-425b-8e83-888bb6794776\") " pod="openshift-marketplace/redhat-marketplace-bw788" Oct 07 13:34:14 crc kubenswrapper[4677]: I1007 13:34:14.965722 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5bce5223-6054-425b-8e83-888bb6794776-utilities\") pod \"redhat-marketplace-bw788\" (UID: \"5bce5223-6054-425b-8e83-888bb6794776\") " pod="openshift-marketplace/redhat-marketplace-bw788" Oct 07 13:34:14 crc kubenswrapper[4677]: I1007 13:34:14.965772 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5bce5223-6054-425b-8e83-888bb6794776-catalog-content\") pod \"redhat-marketplace-bw788\" (UID: \"5bce5223-6054-425b-8e83-888bb6794776\") " pod="openshift-marketplace/redhat-marketplace-bw788" Oct 07 13:34:14 crc kubenswrapper[4677]: I1007 13:34:14.984150 4677 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-rqf7m\" (UniqueName: \"kubernetes.io/projected/5bce5223-6054-425b-8e83-888bb6794776-kube-api-access-rqf7m\") pod \"redhat-marketplace-bw788\" (UID: \"5bce5223-6054-425b-8e83-888bb6794776\") " pod="openshift-marketplace/redhat-marketplace-bw788" Oct 07 13:34:15 crc kubenswrapper[4677]: I1007 13:34:15.154443 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bw788" Oct 07 13:34:15 crc kubenswrapper[4677]: I1007 13:34:15.351498 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bw788"] Oct 07 13:34:15 crc kubenswrapper[4677]: W1007 13:34:15.353394 4677 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5bce5223_6054_425b_8e83_888bb6794776.slice/crio-2c5c0821a5488e671410a8dececc28de4a53ad07da2114b52f59f264416d4cd5 WatchSource:0}: Error finding container 2c5c0821a5488e671410a8dececc28de4a53ad07da2114b52f59f264416d4cd5: Status 404 returned error can't find the container with id 2c5c0821a5488e671410a8dececc28de4a53ad07da2114b52f59f264416d4cd5 Oct 07 13:34:15 crc kubenswrapper[4677]: I1007 13:34:15.951708 4677 generic.go:334] "Generic (PLEG): container finished" podID="5bce5223-6054-425b-8e83-888bb6794776" containerID="114e29511ff3a1bc1ba9371fde03c517954eb3eaae430149fbc07c4aa87cd2be" exitCode=0 Oct 07 13:34:15 crc kubenswrapper[4677]: I1007 13:34:15.951791 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bw788" event={"ID":"5bce5223-6054-425b-8e83-888bb6794776","Type":"ContainerDied","Data":"114e29511ff3a1bc1ba9371fde03c517954eb3eaae430149fbc07c4aa87cd2be"} Oct 07 13:34:15 crc kubenswrapper[4677]: I1007 13:34:15.952015 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bw788" event={"ID":"5bce5223-6054-425b-8e83-888bb6794776","Type":"ContainerStarted","Data":"2c5c0821a5488e671410a8dececc28de4a53ad07da2114b52f59f264416d4cd5"} Oct 07 13:34:15 crc kubenswrapper[4677]: I1007 13:34:15.954997 4677 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 07 13:34:17 crc kubenswrapper[4677]: I1007 13:34:17.968073 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bw788" event={"ID":"5bce5223-6054-425b-8e83-888bb6794776","Type":"ContainerStarted","Data":"b6272a8677008670a76807774b8ffed3dfb313b28a7788b63efdbfa5437caaf7"} Oct 07 13:34:18 crc kubenswrapper[4677]: I1007 13:34:18.976565 4677 generic.go:334] "Generic (PLEG): container finished" podID="5bce5223-6054-425b-8e83-888bb6794776" containerID="b6272a8677008670a76807774b8ffed3dfb313b28a7788b63efdbfa5437caaf7" exitCode=0 Oct 07 13:34:18 crc kubenswrapper[4677]: I1007 13:34:18.976629 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bw788" event={"ID":"5bce5223-6054-425b-8e83-888bb6794776","Type":"ContainerDied","Data":"b6272a8677008670a76807774b8ffed3dfb313b28a7788b63efdbfa5437caaf7"} Oct 07 13:34:19 crc kubenswrapper[4677]: I1007 13:34:19.987485 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bw788" event={"ID":"5bce5223-6054-425b-8e83-888bb6794776","Type":"ContainerStarted","Data":"8b39040c8cd52c0580e855b32ab6d93c3ac071ae3d966fa9a35f843166a5e942"} Oct 07 13:34:20 crc kubenswrapper[4677]: I1007 13:34:20.020906 4677 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-bw788" podStartSLOduration=2.2767546100000002 podStartE2EDuration="6.020879752s" podCreationTimestamp="2025-10-07 13:34:14 +0000 UTC" firstStartedPulling="2025-10-07 13:34:15.954576775 +0000 UTC m=+1627.440285910" lastFinishedPulling="2025-10-07 13:34:19.698701907 +0000 UTC m=+1631.184411052" observedRunningTime="2025-10-07 13:34:20.017616158 +0000 UTC m=+1631.503325363" watchObservedRunningTime="2025-10-07 13:34:20.020879752 +0000 UTC m=+1631.506588877" Oct 07 13:34:25 crc kubenswrapper[4677]: I1007 13:34:25.154620 4677 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-bw788" Oct 07 13:34:25 crc kubenswrapper[4677]: I1007 13:34:25.154891 4677 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-bw788" Oct 07 13:34:25 crc kubenswrapper[4677]: I1007 13:34:25.220203 4677 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-bw788" Oct 07 13:34:26 crc kubenswrapper[4677]: I1007 13:34:26.112593 4677 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-bw788" Oct 07 13:34:26 crc kubenswrapper[4677]: I1007 13:34:26.176537 4677 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-bw788"] Oct 07 13:34:28 crc kubenswrapper[4677]: I1007 13:34:28.048199 4677 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-bw788" podUID="5bce5223-6054-425b-8e83-888bb6794776" containerName="registry-server" containerID="cri-o://8b39040c8cd52c0580e855b32ab6d93c3ac071ae3d966fa9a35f843166a5e942" gracePeriod=2 Oct 07 13:34:28 crc kubenswrapper[4677]: I1007 13:34:28.414069 4677 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bw788" Oct 07 13:34:28 crc kubenswrapper[4677]: I1007 13:34:28.477468 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5bce5223-6054-425b-8e83-888bb6794776-catalog-content\") pod \"5bce5223-6054-425b-8e83-888bb6794776\" (UID: \"5bce5223-6054-425b-8e83-888bb6794776\") " Oct 07 13:34:28 crc kubenswrapper[4677]: I1007 13:34:28.477534 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rqf7m\" (UniqueName: \"kubernetes.io/projected/5bce5223-6054-425b-8e83-888bb6794776-kube-api-access-rqf7m\") pod \"5bce5223-6054-425b-8e83-888bb6794776\" (UID: \"5bce5223-6054-425b-8e83-888bb6794776\") " Oct 07 13:34:28 crc kubenswrapper[4677]: I1007 13:34:28.477584 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5bce5223-6054-425b-8e83-888bb6794776-utilities\") pod \"5bce5223-6054-425b-8e83-888bb6794776\" (UID: \"5bce5223-6054-425b-8e83-888bb6794776\") " Oct 07 13:34:28 crc kubenswrapper[4677]: I1007 13:34:28.478612 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5bce5223-6054-425b-8e83-888bb6794776-utilities" (OuterVolumeSpecName: "utilities") pod "5bce5223-6054-425b-8e83-888bb6794776" (UID: "5bce5223-6054-425b-8e83-888bb6794776"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:34:28 crc kubenswrapper[4677]: I1007 13:34:28.482557 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5bce5223-6054-425b-8e83-888bb6794776-kube-api-access-rqf7m" (OuterVolumeSpecName: "kube-api-access-rqf7m") pod "5bce5223-6054-425b-8e83-888bb6794776" (UID: "5bce5223-6054-425b-8e83-888bb6794776"). InnerVolumeSpecName "kube-api-access-rqf7m". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:34:28 crc kubenswrapper[4677]: I1007 13:34:28.495068 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5bce5223-6054-425b-8e83-888bb6794776-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5bce5223-6054-425b-8e83-888bb6794776" (UID: "5bce5223-6054-425b-8e83-888bb6794776"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:34:28 crc kubenswrapper[4677]: I1007 13:34:28.578988 4677 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5bce5223-6054-425b-8e83-888bb6794776-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 13:34:28 crc kubenswrapper[4677]: I1007 13:34:28.579058 4677 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rqf7m\" (UniqueName: \"kubernetes.io/projected/5bce5223-6054-425b-8e83-888bb6794776-kube-api-access-rqf7m\") on node \"crc\" DevicePath \"\"" Oct 07 13:34:28 crc kubenswrapper[4677]: I1007 13:34:28.579075 4677 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5bce5223-6054-425b-8e83-888bb6794776-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 13:34:29 crc kubenswrapper[4677]: I1007 13:34:29.060328 4677 generic.go:334] "Generic (PLEG): container finished" podID="5bce5223-6054-425b-8e83-888bb6794776" containerID="8b39040c8cd52c0580e855b32ab6d93c3ac071ae3d966fa9a35f843166a5e942" exitCode=0 Oct 07 13:34:29 crc kubenswrapper[4677]: I1007 13:34:29.060371 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bw788" event={"ID":"5bce5223-6054-425b-8e83-888bb6794776","Type":"ContainerDied","Data":"8b39040c8cd52c0580e855b32ab6d93c3ac071ae3d966fa9a35f843166a5e942"} Oct 07 13:34:29 crc kubenswrapper[4677]: I1007 13:34:29.061742 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bw788" event={"ID":"5bce5223-6054-425b-8e83-888bb6794776","Type":"ContainerDied","Data":"2c5c0821a5488e671410a8dececc28de4a53ad07da2114b52f59f264416d4cd5"} Oct 07 13:34:29 crc kubenswrapper[4677]: I1007 13:34:29.061834 4677 scope.go:117] "RemoveContainer" containerID="8b39040c8cd52c0580e855b32ab6d93c3ac071ae3d966fa9a35f843166a5e942" Oct 07 13:34:29 crc kubenswrapper[4677]: I1007 13:34:29.060597 4677 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bw788" Oct 07 13:34:29 crc kubenswrapper[4677]: I1007 13:34:29.084848 4677 scope.go:117] "RemoveContainer" containerID="b6272a8677008670a76807774b8ffed3dfb313b28a7788b63efdbfa5437caaf7" Oct 07 13:34:29 crc kubenswrapper[4677]: I1007 13:34:29.091010 4677 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-bw788"] Oct 07 13:34:29 crc kubenswrapper[4677]: I1007 13:34:29.096800 4677 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-bw788"] Oct 07 13:34:29 crc kubenswrapper[4677]: I1007 13:34:29.114097 4677 scope.go:117] "RemoveContainer" containerID="114e29511ff3a1bc1ba9371fde03c517954eb3eaae430149fbc07c4aa87cd2be" Oct 07 13:34:29 crc kubenswrapper[4677]: I1007 13:34:29.136014 4677 scope.go:117] "RemoveContainer" containerID="8b39040c8cd52c0580e855b32ab6d93c3ac071ae3d966fa9a35f843166a5e942" Oct 07 13:34:29 crc kubenswrapper[4677]: E1007 13:34:29.136505 4677 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8b39040c8cd52c0580e855b32ab6d93c3ac071ae3d966fa9a35f843166a5e942\": container with ID starting with 8b39040c8cd52c0580e855b32ab6d93c3ac071ae3d966fa9a35f843166a5e942 not found: ID does not exist" containerID="8b39040c8cd52c0580e855b32ab6d93c3ac071ae3d966fa9a35f843166a5e942" Oct 07 13:34:29 crc kubenswrapper[4677]: I1007 13:34:29.136555 4677 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b39040c8cd52c0580e855b32ab6d93c3ac071ae3d966fa9a35f843166a5e942"} err="failed to get container status \"8b39040c8cd52c0580e855b32ab6d93c3ac071ae3d966fa9a35f843166a5e942\": rpc error: code = NotFound desc = could not find container \"8b39040c8cd52c0580e855b32ab6d93c3ac071ae3d966fa9a35f843166a5e942\": container with ID starting with 8b39040c8cd52c0580e855b32ab6d93c3ac071ae3d966fa9a35f843166a5e942 not found: ID does not exist" Oct 07 13:34:29 crc kubenswrapper[4677]: I1007 13:34:29.136587 4677 scope.go:117] "RemoveContainer" containerID="b6272a8677008670a76807774b8ffed3dfb313b28a7788b63efdbfa5437caaf7" Oct 07 13:34:29 crc kubenswrapper[4677]: E1007 13:34:29.136991 4677 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b6272a8677008670a76807774b8ffed3dfb313b28a7788b63efdbfa5437caaf7\": container with ID starting with b6272a8677008670a76807774b8ffed3dfb313b28a7788b63efdbfa5437caaf7 not found: ID does not exist" containerID="b6272a8677008670a76807774b8ffed3dfb313b28a7788b63efdbfa5437caaf7" Oct 07 13:34:29 crc kubenswrapper[4677]: I1007 13:34:29.137016 4677 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6272a8677008670a76807774b8ffed3dfb313b28a7788b63efdbfa5437caaf7"} err="failed to get container status \"b6272a8677008670a76807774b8ffed3dfb313b28a7788b63efdbfa5437caaf7\": rpc error: code = NotFound desc = could not find container \"b6272a8677008670a76807774b8ffed3dfb313b28a7788b63efdbfa5437caaf7\": container with ID starting with b6272a8677008670a76807774b8ffed3dfb313b28a7788b63efdbfa5437caaf7 not found: ID does not exist" Oct 07 13:34:29 crc kubenswrapper[4677]: I1007 13:34:29.137035 4677 scope.go:117] "RemoveContainer" containerID="114e29511ff3a1bc1ba9371fde03c517954eb3eaae430149fbc07c4aa87cd2be" Oct 07 13:34:29 crc kubenswrapper[4677]: E1007 13:34:29.137365 4677 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"114e29511ff3a1bc1ba9371fde03c517954eb3eaae430149fbc07c4aa87cd2be\": container with ID starting with 114e29511ff3a1bc1ba9371fde03c517954eb3eaae430149fbc07c4aa87cd2be not found: ID does not exist" containerID="114e29511ff3a1bc1ba9371fde03c517954eb3eaae430149fbc07c4aa87cd2be" Oct 07 13:34:29 crc kubenswrapper[4677]: I1007 13:34:29.137450 4677 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"114e29511ff3a1bc1ba9371fde03c517954eb3eaae430149fbc07c4aa87cd2be"} err="failed to get container status \"114e29511ff3a1bc1ba9371fde03c517954eb3eaae430149fbc07c4aa87cd2be\": rpc error: code = NotFound desc = could not find container \"114e29511ff3a1bc1ba9371fde03c517954eb3eaae430149fbc07c4aa87cd2be\": container with ID starting with 114e29511ff3a1bc1ba9371fde03c517954eb3eaae430149fbc07c4aa87cd2be not found: ID does not exist" Oct 07 13:34:29 crc kubenswrapper[4677]: I1007 13:34:29.312524 4677 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5bce5223-6054-425b-8e83-888bb6794776" path="/var/lib/kubelet/pods/5bce5223-6054-425b-8e83-888bb6794776/volumes" Oct 07 13:34:40 crc kubenswrapper[4677]: I1007 13:34:40.918288 4677 patch_prober.go:28] interesting pod/machine-config-daemon-r7cnz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 13:34:40 crc kubenswrapper[4677]: I1007 13:34:40.919243 4677 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-r7cnz" podUID="7879fa59-a7cb-4d29-ba3a-c91f43bfcba6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 13:35:10 crc kubenswrapper[4677]: I1007 13:35:10.917975 4677 patch_prober.go:28] interesting pod/machine-config-daemon-r7cnz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 13:35:10 crc kubenswrapper[4677]: I1007 13:35:10.920234 4677 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-r7cnz" podUID="7879fa59-a7cb-4d29-ba3a-c91f43bfcba6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 13:35:10 crc kubenswrapper[4677]: I1007 13:35:10.920494 4677 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-r7cnz" Oct 07 13:35:10 crc kubenswrapper[4677]: I1007 13:35:10.921732 4677 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9c395efdf36cc3142121da31bd43ed02b4d941c9bf27ad091d7892140334598b"} pod="openshift-machine-config-operator/machine-config-daemon-r7cnz" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 07 13:35:10 crc kubenswrapper[4677]: I1007 13:35:10.922012 4677 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-r7cnz" 
podUID="7879fa59-a7cb-4d29-ba3a-c91f43bfcba6" containerName="machine-config-daemon" containerID="cri-o://9c395efdf36cc3142121da31bd43ed02b4d941c9bf27ad091d7892140334598b" gracePeriod=600 Oct 07 13:35:11 crc kubenswrapper[4677]: E1007 13:35:11.054821 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-r7cnz_openshift-machine-config-operator(7879fa59-a7cb-4d29-ba3a-c91f43bfcba6)\"" pod="openshift-machine-config-operator/machine-config-daemon-r7cnz" podUID="7879fa59-a7cb-4d29-ba3a-c91f43bfcba6" Oct 07 13:35:11 crc kubenswrapper[4677]: I1007 13:35:11.385810 4677 generic.go:334] "Generic (PLEG): container finished" podID="7879fa59-a7cb-4d29-ba3a-c91f43bfcba6" containerID="9c395efdf36cc3142121da31bd43ed02b4d941c9bf27ad091d7892140334598b" exitCode=0 Oct 07 13:35:11 crc kubenswrapper[4677]: I1007 13:35:11.385898 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-r7cnz" event={"ID":"7879fa59-a7cb-4d29-ba3a-c91f43bfcba6","Type":"ContainerDied","Data":"9c395efdf36cc3142121da31bd43ed02b4d941c9bf27ad091d7892140334598b"} Oct 07 13:35:11 crc kubenswrapper[4677]: I1007 13:35:11.386151 4677 scope.go:117] "RemoveContainer" containerID="ae51911f66a891e48f90a715757af601cb9a83227dd098a756b0a7910507b4bc" Oct 07 13:35:11 crc kubenswrapper[4677]: I1007 13:35:11.386677 4677 scope.go:117] "RemoveContainer" containerID="9c395efdf36cc3142121da31bd43ed02b4d941c9bf27ad091d7892140334598b" Oct 07 13:35:11 crc kubenswrapper[4677]: E1007 13:35:11.386884 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-r7cnz_openshift-machine-config-operator(7879fa59-a7cb-4d29-ba3a-c91f43bfcba6)\"" pod="openshift-machine-config-operator/machine-config-daemon-r7cnz" podUID="7879fa59-a7cb-4d29-ba3a-c91f43bfcba6" Oct 07 13:35:22 crc kubenswrapper[4677]: I1007 13:35:22.303783 4677 scope.go:117] "RemoveContainer" containerID="9c395efdf36cc3142121da31bd43ed02b4d941c9bf27ad091d7892140334598b" Oct 07 13:35:22 crc kubenswrapper[4677]: E1007 13:35:22.304568 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-r7cnz_openshift-machine-config-operator(7879fa59-a7cb-4d29-ba3a-c91f43bfcba6)\"" pod="openshift-machine-config-operator/machine-config-daemon-r7cnz" podUID="7879fa59-a7cb-4d29-ba3a-c91f43bfcba6" Oct 07 13:35:34 crc kubenswrapper[4677]: I1007 13:35:34.302983 4677 scope.go:117] "RemoveContainer" containerID="9c395efdf36cc3142121da31bd43ed02b4d941c9bf27ad091d7892140334598b" Oct 07 13:35:34 crc kubenswrapper[4677]: E1007 13:35:34.303503 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-r7cnz_openshift-machine-config-operator(7879fa59-a7cb-4d29-ba3a-c91f43bfcba6)\"" pod="openshift-machine-config-operator/machine-config-daemon-r7cnz" podUID="7879fa59-a7cb-4d29-ba3a-c91f43bfcba6" Oct 07 13:35:48 crc kubenswrapper[4677]: I1007 13:35:48.303648 4677 scope.go:117] 
"RemoveContainer" containerID="9c395efdf36cc3142121da31bd43ed02b4d941c9bf27ad091d7892140334598b" Oct 07 13:35:48 crc kubenswrapper[4677]: E1007 13:35:48.304812 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-r7cnz_openshift-machine-config-operator(7879fa59-a7cb-4d29-ba3a-c91f43bfcba6)\"" pod="openshift-machine-config-operator/machine-config-daemon-r7cnz" podUID="7879fa59-a7cb-4d29-ba3a-c91f43bfcba6" Oct 07 13:36:00 crc kubenswrapper[4677]: I1007 13:36:00.303757 4677 scope.go:117] "RemoveContainer" containerID="9c395efdf36cc3142121da31bd43ed02b4d941c9bf27ad091d7892140334598b" Oct 07 13:36:00 crc kubenswrapper[4677]: E1007 13:36:00.304834 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-r7cnz_openshift-machine-config-operator(7879fa59-a7cb-4d29-ba3a-c91f43bfcba6)\"" pod="openshift-machine-config-operator/machine-config-daemon-r7cnz" podUID="7879fa59-a7cb-4d29-ba3a-c91f43bfcba6" Oct 07 13:36:12 crc kubenswrapper[4677]: I1007 13:36:12.303321 4677 scope.go:117] "RemoveContainer" containerID="9c395efdf36cc3142121da31bd43ed02b4d941c9bf27ad091d7892140334598b" Oct 07 13:36:12 crc kubenswrapper[4677]: E1007 13:36:12.304341 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-r7cnz_openshift-machine-config-operator(7879fa59-a7cb-4d29-ba3a-c91f43bfcba6)\"" pod="openshift-machine-config-operator/machine-config-daemon-r7cnz" podUID="7879fa59-a7cb-4d29-ba3a-c91f43bfcba6" Oct 07 13:36:24 crc kubenswrapper[4677]: I1007 13:36:24.302923 4677 scope.go:117] "RemoveContainer" containerID="9c395efdf36cc3142121da31bd43ed02b4d941c9bf27ad091d7892140334598b" Oct 07 13:36:24 crc kubenswrapper[4677]: E1007 13:36:24.303667 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-r7cnz_openshift-machine-config-operator(7879fa59-a7cb-4d29-ba3a-c91f43bfcba6)\"" pod="openshift-machine-config-operator/machine-config-daemon-r7cnz" podUID="7879fa59-a7cb-4d29-ba3a-c91f43bfcba6" Oct 07 13:36:37 crc kubenswrapper[4677]: I1007 13:36:37.304043 4677 scope.go:117] "RemoveContainer" containerID="9c395efdf36cc3142121da31bd43ed02b4d941c9bf27ad091d7892140334598b" Oct 07 13:36:37 crc kubenswrapper[4677]: E1007 13:36:37.304711 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-r7cnz_openshift-machine-config-operator(7879fa59-a7cb-4d29-ba3a-c91f43bfcba6)\"" pod="openshift-machine-config-operator/machine-config-daemon-r7cnz" podUID="7879fa59-a7cb-4d29-ba3a-c91f43bfcba6" Oct 07 13:36:50 crc kubenswrapper[4677]: I1007 13:36:50.303325 4677 scope.go:117] "RemoveContainer" containerID="9c395efdf36cc3142121da31bd43ed02b4d941c9bf27ad091d7892140334598b" Oct 07 13:36:50 crc kubenswrapper[4677]: E1007 13:36:50.304230 4677 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-r7cnz_openshift-machine-config-operator(7879fa59-a7cb-4d29-ba3a-c91f43bfcba6)\"" pod="openshift-machine-config-operator/machine-config-daemon-r7cnz" podUID="7879fa59-a7cb-4d29-ba3a-c91f43bfcba6" Oct 07 13:37:03 crc kubenswrapper[4677]: I1007 13:37:03.302849 4677 scope.go:117] "RemoveContainer" containerID="9c395efdf36cc3142121da31bd43ed02b4d941c9bf27ad091d7892140334598b" Oct 07 13:37:03 crc kubenswrapper[4677]: E1007 13:37:03.303425 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-r7cnz_openshift-machine-config-operator(7879fa59-a7cb-4d29-ba3a-c91f43bfcba6)\"" pod="openshift-machine-config-operator/machine-config-daemon-r7cnz" podUID="7879fa59-a7cb-4d29-ba3a-c91f43bfcba6" Oct 07 13:37:14 crc kubenswrapper[4677]: I1007 13:37:14.302896 4677 scope.go:117] "RemoveContainer" containerID="9c395efdf36cc3142121da31bd43ed02b4d941c9bf27ad091d7892140334598b" Oct 07 13:37:14 crc kubenswrapper[4677]: E1007 13:37:14.303732 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-r7cnz_openshift-machine-config-operator(7879fa59-a7cb-4d29-ba3a-c91f43bfcba6)\"" pod="openshift-machine-config-operator/machine-config-daemon-r7cnz" podUID="7879fa59-a7cb-4d29-ba3a-c91f43bfcba6" Oct 07 13:37:25 crc kubenswrapper[4677]: I1007 13:37:25.061681 4677 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-db-create-zhhhm"] Oct 07 13:37:25 crc kubenswrapper[4677]: I1007 13:37:25.067852 4677 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone-db-create-zhhhm"] Oct 07 13:37:25 crc kubenswrapper[4677]: I1007 13:37:25.303645 4677 scope.go:117] "RemoveContainer" containerID="9c395efdf36cc3142121da31bd43ed02b4d941c9bf27ad091d7892140334598b" Oct 07 13:37:25 crc kubenswrapper[4677]: E1007 13:37:25.303923 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-r7cnz_openshift-machine-config-operator(7879fa59-a7cb-4d29-ba3a-c91f43bfcba6)\"" pod="openshift-machine-config-operator/machine-config-daemon-r7cnz" podUID="7879fa59-a7cb-4d29-ba3a-c91f43bfcba6" Oct 07 13:37:25 crc kubenswrapper[4677]: I1007 13:37:25.312168 4677 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1311532b-7d0f-44bd-9e3b-4f910a018031" path="/var/lib/kubelet/pods/1311532b-7d0f-44bd-9e3b-4f910a018031/volumes" Oct 07 13:37:35 crc kubenswrapper[4677]: I1007 13:37:35.042975 4677 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-9e17-account-create-bpbk4"] Oct 07 13:37:35 crc kubenswrapper[4677]: I1007 13:37:35.048783 4677 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone-9e17-account-create-bpbk4"] Oct 07 13:37:35 crc kubenswrapper[4677]: I1007 13:37:35.315189 4677 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0dd04856-1b05-4718-965c-95a0e396c5d6" 
path="/var/lib/kubelet/pods/0dd04856-1b05-4718-965c-95a0e396c5d6/volumes" Oct 07 13:37:40 crc kubenswrapper[4677]: I1007 13:37:40.303579 4677 scope.go:117] "RemoveContainer" containerID="9c395efdf36cc3142121da31bd43ed02b4d941c9bf27ad091d7892140334598b" Oct 07 13:37:40 crc kubenswrapper[4677]: E1007 13:37:40.303991 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-r7cnz_openshift-machine-config-operator(7879fa59-a7cb-4d29-ba3a-c91f43bfcba6)\"" pod="openshift-machine-config-operator/machine-config-daemon-r7cnz" podUID="7879fa59-a7cb-4d29-ba3a-c91f43bfcba6" Oct 07 13:37:42 crc kubenswrapper[4677]: I1007 13:37:42.024123 4677 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-db-sync-76wjh"] Oct 07 13:37:42 crc kubenswrapper[4677]: I1007 13:37:42.030149 4677 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone-db-sync-76wjh"] Oct 07 13:37:43 crc kubenswrapper[4677]: I1007 13:37:43.315389 4677 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57e4acb7-6ca2-4ca3-acfe-b1cb5f23917c" path="/var/lib/kubelet/pods/57e4acb7-6ca2-4ca3-acfe-b1cb5f23917c/volumes" Oct 07 13:37:48 crc kubenswrapper[4677]: I1007 13:37:48.021964 4677 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-bootstrap-jghjv"] Oct 07 13:37:48 crc kubenswrapper[4677]: I1007 13:37:48.031957 4677 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone-bootstrap-jghjv"] Oct 07 13:37:49 crc kubenswrapper[4677]: I1007 13:37:49.312815 4677 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0aad62c8-faba-4b12-a924-19089f667587" path="/var/lib/kubelet/pods/0aad62c8-faba-4b12-a924-19089f667587/volumes" Oct 07 13:37:53 crc kubenswrapper[4677]: I1007 13:37:53.303685 4677 scope.go:117] "RemoveContainer" containerID="9c395efdf36cc3142121da31bd43ed02b4d941c9bf27ad091d7892140334598b" Oct 07 13:37:53 crc kubenswrapper[4677]: E1007 13:37:53.304281 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-r7cnz_openshift-machine-config-operator(7879fa59-a7cb-4d29-ba3a-c91f43bfcba6)\"" pod="openshift-machine-config-operator/machine-config-daemon-r7cnz" podUID="7879fa59-a7cb-4d29-ba3a-c91f43bfcba6" Oct 07 13:38:00 crc kubenswrapper[4677]: I1007 13:38:00.880522 4677 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/openstackclient"] Oct 07 13:38:00 crc kubenswrapper[4677]: I1007 13:38:00.881069 4677 kuberuntime_container.go:808] "Killing container with a grace period" pod="keystone-kuttl-tests/openstackclient" podUID="94dc3f8c-381e-451a-abd2-802c99de6e6b" containerName="openstackclient" containerID="cri-o://34c762bd95770673880dfbe776d4e1d789ab5a131e4de939949d0cfc6517da1a" gracePeriod=30 Oct 07 13:38:01 crc kubenswrapper[4677]: I1007 13:38:01.694326 4677 generic.go:334] "Generic (PLEG): container finished" podID="94dc3f8c-381e-451a-abd2-802c99de6e6b" containerID="34c762bd95770673880dfbe776d4e1d789ab5a131e4de939949d0cfc6517da1a" exitCode=143 Oct 07 13:38:01 crc kubenswrapper[4677]: I1007 13:38:01.694374 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/openstackclient" 
event={"ID":"94dc3f8c-381e-451a-abd2-802c99de6e6b","Type":"ContainerDied","Data":"34c762bd95770673880dfbe776d4e1d789ab5a131e4de939949d0cfc6517da1a"} Oct 07 13:38:01 crc kubenswrapper[4677]: I1007 13:38:01.953340 4677 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/openstackclient" Oct 07 13:38:02 crc kubenswrapper[4677]: I1007 13:38:02.073551 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/94dc3f8c-381e-451a-abd2-802c99de6e6b-openstack-config\") pod \"94dc3f8c-381e-451a-abd2-802c99de6e6b\" (UID: \"94dc3f8c-381e-451a-abd2-802c99de6e6b\") " Oct 07 13:38:02 crc kubenswrapper[4677]: I1007 13:38:02.073654 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/94dc3f8c-381e-451a-abd2-802c99de6e6b-openstack-config-secret\") pod \"94dc3f8c-381e-451a-abd2-802c99de6e6b\" (UID: \"94dc3f8c-381e-451a-abd2-802c99de6e6b\") " Oct 07 13:38:02 crc kubenswrapper[4677]: I1007 13:38:02.073701 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gzv64\" (UniqueName: \"kubernetes.io/projected/94dc3f8c-381e-451a-abd2-802c99de6e6b-kube-api-access-gzv64\") pod \"94dc3f8c-381e-451a-abd2-802c99de6e6b\" (UID: \"94dc3f8c-381e-451a-abd2-802c99de6e6b\") " Oct 07 13:38:02 crc kubenswrapper[4677]: I1007 13:38:02.081185 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/94dc3f8c-381e-451a-abd2-802c99de6e6b-kube-api-access-gzv64" (OuterVolumeSpecName: "kube-api-access-gzv64") pod "94dc3f8c-381e-451a-abd2-802c99de6e6b" (UID: "94dc3f8c-381e-451a-abd2-802c99de6e6b"). InnerVolumeSpecName "kube-api-access-gzv64". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:38:02 crc kubenswrapper[4677]: I1007 13:38:02.099251 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/94dc3f8c-381e-451a-abd2-802c99de6e6b-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "94dc3f8c-381e-451a-abd2-802c99de6e6b" (UID: "94dc3f8c-381e-451a-abd2-802c99de6e6b"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:38:02 crc kubenswrapper[4677]: I1007 13:38:02.102345 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94dc3f8c-381e-451a-abd2-802c99de6e6b-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "94dc3f8c-381e-451a-abd2-802c99de6e6b" (UID: "94dc3f8c-381e-451a-abd2-802c99de6e6b"). InnerVolumeSpecName "openstack-config-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:38:02 crc kubenswrapper[4677]: I1007 13:38:02.175276 4677 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/94dc3f8c-381e-451a-abd2-802c99de6e6b-openstack-config\") on node \"crc\" DevicePath \"\"" Oct 07 13:38:02 crc kubenswrapper[4677]: I1007 13:38:02.175308 4677 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/94dc3f8c-381e-451a-abd2-802c99de6e6b-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Oct 07 13:38:02 crc kubenswrapper[4677]: I1007 13:38:02.175319 4677 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gzv64\" (UniqueName: \"kubernetes.io/projected/94dc3f8c-381e-451a-abd2-802c99de6e6b-kube-api-access-gzv64\") on node \"crc\" DevicePath \"\"" Oct 07 13:38:02 crc kubenswrapper[4677]: I1007 13:38:02.712257 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/openstackclient" event={"ID":"94dc3f8c-381e-451a-abd2-802c99de6e6b","Type":"ContainerDied","Data":"6d649fa0e2636a1f3d9b9566048b8df712474f351abb8e3646ca73084d447b17"} Oct 07 13:38:02 crc kubenswrapper[4677]: I1007 13:38:02.712343 4677 scope.go:117] "RemoveContainer" containerID="34c762bd95770673880dfbe776d4e1d789ab5a131e4de939949d0cfc6517da1a" Oct 07 13:38:02 crc kubenswrapper[4677]: I1007 13:38:02.712638 4677 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/openstackclient" Oct 07 13:38:02 crc kubenswrapper[4677]: I1007 13:38:02.756831 4677 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/openstackclient"] Oct 07 13:38:02 crc kubenswrapper[4677]: I1007 13:38:02.765147 4677 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/openstackclient"] Oct 07 13:38:02 crc kubenswrapper[4677]: I1007 13:38:02.889807 4677 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-5f7d4d4854-cgrcw"] Oct 07 13:38:02 crc kubenswrapper[4677]: I1007 13:38:02.890598 4677 kuberuntime_container.go:808] "Killing container with a grace period" pod="keystone-kuttl-tests/keystone-5f7d4d4854-cgrcw" podUID="c4ee04d7-9071-415d-94cd-daaace73b138" containerName="keystone-api" containerID="cri-o://e77af5fd03b269759c612e7fd4fbf7232232e59583939d50e3f782bd7ca84dae" gracePeriod=30 Oct 07 13:38:02 crc kubenswrapper[4677]: I1007 13:38:02.948832 4677 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone9e17-account-delete-fn2tq"] Oct 07 13:38:02 crc kubenswrapper[4677]: E1007 13:38:02.949076 4677 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5bce5223-6054-425b-8e83-888bb6794776" containerName="extract-content" Oct 07 13:38:02 crc kubenswrapper[4677]: I1007 13:38:02.949090 4677 state_mem.go:107] "Deleted CPUSet assignment" podUID="5bce5223-6054-425b-8e83-888bb6794776" containerName="extract-content" Oct 07 13:38:02 crc kubenswrapper[4677]: E1007 13:38:02.949103 4677 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94dc3f8c-381e-451a-abd2-802c99de6e6b" containerName="openstackclient" Oct 07 13:38:02 crc kubenswrapper[4677]: I1007 13:38:02.949109 4677 state_mem.go:107] "Deleted CPUSet assignment" podUID="94dc3f8c-381e-451a-abd2-802c99de6e6b" containerName="openstackclient" Oct 07 13:38:02 crc kubenswrapper[4677]: E1007 13:38:02.949119 4677 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="5bce5223-6054-425b-8e83-888bb6794776" containerName="extract-utilities" Oct 07 13:38:02 crc kubenswrapper[4677]: I1007 13:38:02.949125 4677 state_mem.go:107] "Deleted CPUSet assignment" podUID="5bce5223-6054-425b-8e83-888bb6794776" containerName="extract-utilities" Oct 07 13:38:02 crc kubenswrapper[4677]: E1007 13:38:02.949137 4677 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5bce5223-6054-425b-8e83-888bb6794776" containerName="registry-server" Oct 07 13:38:02 crc kubenswrapper[4677]: I1007 13:38:02.949142 4677 state_mem.go:107] "Deleted CPUSet assignment" podUID="5bce5223-6054-425b-8e83-888bb6794776" containerName="registry-server" Oct 07 13:38:02 crc kubenswrapper[4677]: I1007 13:38:02.949260 4677 memory_manager.go:354] "RemoveStaleState removing state" podUID="5bce5223-6054-425b-8e83-888bb6794776" containerName="registry-server" Oct 07 13:38:02 crc kubenswrapper[4677]: I1007 13:38:02.949270 4677 memory_manager.go:354] "RemoveStaleState removing state" podUID="94dc3f8c-381e-451a-abd2-802c99de6e6b" containerName="openstackclient" Oct 07 13:38:02 crc kubenswrapper[4677]: I1007 13:38:02.949744 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone9e17-account-delete-fn2tq" Oct 07 13:38:02 crc kubenswrapper[4677]: I1007 13:38:02.957823 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone9e17-account-delete-fn2tq"] Oct 07 13:38:02 crc kubenswrapper[4677]: I1007 13:38:02.973948 4677 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone9e17-account-delete-fn2tq"] Oct 07 13:38:02 crc kubenswrapper[4677]: E1007 13:38:02.974703 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-xgjh6], unattached volumes=[], failed to process volumes=[kube-api-access-xgjh6]: context canceled" pod="keystone-kuttl-tests/keystone9e17-account-delete-fn2tq" podUID="24ed40b2-9bdb-461a-b4e6-be2f0d2c8c58" Oct 07 13:38:03 crc kubenswrapper[4677]: I1007 13:38:03.089296 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xgjh6\" (UniqueName: \"kubernetes.io/projected/24ed40b2-9bdb-461a-b4e6-be2f0d2c8c58-kube-api-access-xgjh6\") pod \"keystone9e17-account-delete-fn2tq\" (UID: \"24ed40b2-9bdb-461a-b4e6-be2f0d2c8c58\") " pod="keystone-kuttl-tests/keystone9e17-account-delete-fn2tq" Oct 07 13:38:03 crc kubenswrapper[4677]: I1007 13:38:03.190329 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xgjh6\" (UniqueName: \"kubernetes.io/projected/24ed40b2-9bdb-461a-b4e6-be2f0d2c8c58-kube-api-access-xgjh6\") pod \"keystone9e17-account-delete-fn2tq\" (UID: \"24ed40b2-9bdb-461a-b4e6-be2f0d2c8c58\") " pod="keystone-kuttl-tests/keystone9e17-account-delete-fn2tq" Oct 07 13:38:03 crc kubenswrapper[4677]: I1007 13:38:03.213421 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xgjh6\" (UniqueName: \"kubernetes.io/projected/24ed40b2-9bdb-461a-b4e6-be2f0d2c8c58-kube-api-access-xgjh6\") pod \"keystone9e17-account-delete-fn2tq\" (UID: \"24ed40b2-9bdb-461a-b4e6-be2f0d2c8c58\") " pod="keystone-kuttl-tests/keystone9e17-account-delete-fn2tq" Oct 07 13:38:03 crc kubenswrapper[4677]: I1007 13:38:03.313040 4677 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="94dc3f8c-381e-451a-abd2-802c99de6e6b" path="/var/lib/kubelet/pods/94dc3f8c-381e-451a-abd2-802c99de6e6b/volumes" Oct 07 13:38:03 
crc kubenswrapper[4677]: I1007 13:38:03.723334 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone9e17-account-delete-fn2tq" Oct 07 13:38:03 crc kubenswrapper[4677]: I1007 13:38:03.733212 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone9e17-account-delete-fn2tq" Oct 07 13:38:03 crc kubenswrapper[4677]: I1007 13:38:03.898876 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xgjh6\" (UniqueName: \"kubernetes.io/projected/24ed40b2-9bdb-461a-b4e6-be2f0d2c8c58-kube-api-access-xgjh6\") pod \"24ed40b2-9bdb-461a-b4e6-be2f0d2c8c58\" (UID: \"24ed40b2-9bdb-461a-b4e6-be2f0d2c8c58\") " Oct 07 13:38:03 crc kubenswrapper[4677]: I1007 13:38:03.902605 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/24ed40b2-9bdb-461a-b4e6-be2f0d2c8c58-kube-api-access-xgjh6" (OuterVolumeSpecName: "kube-api-access-xgjh6") pod "24ed40b2-9bdb-461a-b4e6-be2f0d2c8c58" (UID: "24ed40b2-9bdb-461a-b4e6-be2f0d2c8c58"). InnerVolumeSpecName "kube-api-access-xgjh6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:38:04 crc kubenswrapper[4677]: I1007 13:38:04.000694 4677 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xgjh6\" (UniqueName: \"kubernetes.io/projected/24ed40b2-9bdb-461a-b4e6-be2f0d2c8c58-kube-api-access-xgjh6\") on node \"crc\" DevicePath \"\"" Oct 07 13:38:04 crc kubenswrapper[4677]: I1007 13:38:04.728976 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone9e17-account-delete-fn2tq" Oct 07 13:38:04 crc kubenswrapper[4677]: I1007 13:38:04.779748 4677 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone9e17-account-delete-fn2tq"] Oct 07 13:38:04 crc kubenswrapper[4677]: I1007 13:38:04.788133 4677 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone9e17-account-delete-fn2tq"] Oct 07 13:38:05 crc kubenswrapper[4677]: I1007 13:38:05.313083 4677 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="24ed40b2-9bdb-461a-b4e6-be2f0d2c8c58" path="/var/lib/kubelet/pods/24ed40b2-9bdb-461a-b4e6-be2f0d2c8c58/volumes" Oct 07 13:38:06 crc kubenswrapper[4677]: I1007 13:38:06.384281 4677 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-5f7d4d4854-cgrcw" Oct 07 13:38:06 crc kubenswrapper[4677]: I1007 13:38:06.531665 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c4ee04d7-9071-415d-94cd-daaace73b138-scripts\") pod \"c4ee04d7-9071-415d-94cd-daaace73b138\" (UID: \"c4ee04d7-9071-415d-94cd-daaace73b138\") " Oct 07 13:38:06 crc kubenswrapper[4677]: I1007 13:38:06.531808 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6pv57\" (UniqueName: \"kubernetes.io/projected/c4ee04d7-9071-415d-94cd-daaace73b138-kube-api-access-6pv57\") pod \"c4ee04d7-9071-415d-94cd-daaace73b138\" (UID: \"c4ee04d7-9071-415d-94cd-daaace73b138\") " Oct 07 13:38:06 crc kubenswrapper[4677]: I1007 13:38:06.531865 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c4ee04d7-9071-415d-94cd-daaace73b138-fernet-keys\") pod \"c4ee04d7-9071-415d-94cd-daaace73b138\" (UID: \"c4ee04d7-9071-415d-94cd-daaace73b138\") " Oct 07 13:38:06 crc kubenswrapper[4677]: I1007 13:38:06.531890 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c4ee04d7-9071-415d-94cd-daaace73b138-config-data\") pod \"c4ee04d7-9071-415d-94cd-daaace73b138\" (UID: \"c4ee04d7-9071-415d-94cd-daaace73b138\") " Oct 07 13:38:06 crc kubenswrapper[4677]: I1007 13:38:06.531909 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/c4ee04d7-9071-415d-94cd-daaace73b138-credential-keys\") pod \"c4ee04d7-9071-415d-94cd-daaace73b138\" (UID: \"c4ee04d7-9071-415d-94cd-daaace73b138\") " Oct 07 13:38:06 crc kubenswrapper[4677]: I1007 13:38:06.536558 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c4ee04d7-9071-415d-94cd-daaace73b138-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "c4ee04d7-9071-415d-94cd-daaace73b138" (UID: "c4ee04d7-9071-415d-94cd-daaace73b138"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:38:06 crc kubenswrapper[4677]: I1007 13:38:06.536940 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c4ee04d7-9071-415d-94cd-daaace73b138-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "c4ee04d7-9071-415d-94cd-daaace73b138" (UID: "c4ee04d7-9071-415d-94cd-daaace73b138"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:38:06 crc kubenswrapper[4677]: I1007 13:38:06.537105 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c4ee04d7-9071-415d-94cd-daaace73b138-kube-api-access-6pv57" (OuterVolumeSpecName: "kube-api-access-6pv57") pod "c4ee04d7-9071-415d-94cd-daaace73b138" (UID: "c4ee04d7-9071-415d-94cd-daaace73b138"). InnerVolumeSpecName "kube-api-access-6pv57". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:38:06 crc kubenswrapper[4677]: I1007 13:38:06.537475 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c4ee04d7-9071-415d-94cd-daaace73b138-scripts" (OuterVolumeSpecName: "scripts") pod "c4ee04d7-9071-415d-94cd-daaace73b138" (UID: "c4ee04d7-9071-415d-94cd-daaace73b138"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:38:06 crc kubenswrapper[4677]: I1007 13:38:06.567328 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c4ee04d7-9071-415d-94cd-daaace73b138-config-data" (OuterVolumeSpecName: "config-data") pod "c4ee04d7-9071-415d-94cd-daaace73b138" (UID: "c4ee04d7-9071-415d-94cd-daaace73b138"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:38:06 crc kubenswrapper[4677]: I1007 13:38:06.633065 4677 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c4ee04d7-9071-415d-94cd-daaace73b138-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 13:38:06 crc kubenswrapper[4677]: I1007 13:38:06.633107 4677 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6pv57\" (UniqueName: \"kubernetes.io/projected/c4ee04d7-9071-415d-94cd-daaace73b138-kube-api-access-6pv57\") on node \"crc\" DevicePath \"\"" Oct 07 13:38:06 crc kubenswrapper[4677]: I1007 13:38:06.633122 4677 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c4ee04d7-9071-415d-94cd-daaace73b138-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 07 13:38:06 crc kubenswrapper[4677]: I1007 13:38:06.633137 4677 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c4ee04d7-9071-415d-94cd-daaace73b138-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 13:38:06 crc kubenswrapper[4677]: I1007 13:38:06.633149 4677 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/c4ee04d7-9071-415d-94cd-daaace73b138-credential-keys\") on node \"crc\" DevicePath \"\"" Oct 07 13:38:06 crc kubenswrapper[4677]: I1007 13:38:06.748032 4677 generic.go:334] "Generic (PLEG): container finished" podID="c4ee04d7-9071-415d-94cd-daaace73b138" containerID="e77af5fd03b269759c612e7fd4fbf7232232e59583939d50e3f782bd7ca84dae" exitCode=0 Oct 07 13:38:06 crc kubenswrapper[4677]: I1007 13:38:06.748114 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-5f7d4d4854-cgrcw" event={"ID":"c4ee04d7-9071-415d-94cd-daaace73b138","Type":"ContainerDied","Data":"e77af5fd03b269759c612e7fd4fbf7232232e59583939d50e3f782bd7ca84dae"} Oct 07 13:38:06 crc kubenswrapper[4677]: I1007 13:38:06.748165 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-5f7d4d4854-cgrcw" event={"ID":"c4ee04d7-9071-415d-94cd-daaace73b138","Type":"ContainerDied","Data":"e3ec609018d4090adc41e9c692194e95d0fd44715f28ae71bacd11ab8b136711"} Oct 07 13:38:06 crc kubenswrapper[4677]: I1007 13:38:06.748204 4677 scope.go:117] "RemoveContainer" containerID="e77af5fd03b269759c612e7fd4fbf7232232e59583939d50e3f782bd7ca84dae" Oct 07 13:38:06 crc kubenswrapper[4677]: I1007 13:38:06.748221 4677 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-5f7d4d4854-cgrcw" Oct 07 13:38:06 crc kubenswrapper[4677]: I1007 13:38:06.778836 4677 scope.go:117] "RemoveContainer" containerID="e77af5fd03b269759c612e7fd4fbf7232232e59583939d50e3f782bd7ca84dae" Oct 07 13:38:06 crc kubenswrapper[4677]: E1007 13:38:06.779510 4677 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e77af5fd03b269759c612e7fd4fbf7232232e59583939d50e3f782bd7ca84dae\": container with ID starting with e77af5fd03b269759c612e7fd4fbf7232232e59583939d50e3f782bd7ca84dae not found: ID does not exist" containerID="e77af5fd03b269759c612e7fd4fbf7232232e59583939d50e3f782bd7ca84dae" Oct 07 13:38:06 crc kubenswrapper[4677]: I1007 13:38:06.779564 4677 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e77af5fd03b269759c612e7fd4fbf7232232e59583939d50e3f782bd7ca84dae"} err="failed to get container status \"e77af5fd03b269759c612e7fd4fbf7232232e59583939d50e3f782bd7ca84dae\": rpc error: code = NotFound desc = could not find container \"e77af5fd03b269759c612e7fd4fbf7232232e59583939d50e3f782bd7ca84dae\": container with ID starting with e77af5fd03b269759c612e7fd4fbf7232232e59583939d50e3f782bd7ca84dae not found: ID does not exist" Oct 07 13:38:06 crc kubenswrapper[4677]: I1007 13:38:06.793041 4677 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-5f7d4d4854-cgrcw"] Oct 07 13:38:06 crc kubenswrapper[4677]: I1007 13:38:06.804214 4677 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone-5f7d4d4854-cgrcw"] Oct 07 13:38:07 crc kubenswrapper[4677]: I1007 13:38:07.303054 4677 scope.go:117] "RemoveContainer" containerID="9c395efdf36cc3142121da31bd43ed02b4d941c9bf27ad091d7892140334598b" Oct 07 13:38:07 crc kubenswrapper[4677]: E1007 13:38:07.303541 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-r7cnz_openshift-machine-config-operator(7879fa59-a7cb-4d29-ba3a-c91f43bfcba6)\"" pod="openshift-machine-config-operator/machine-config-daemon-r7cnz" podUID="7879fa59-a7cb-4d29-ba3a-c91f43bfcba6" Oct 07 13:38:07 crc kubenswrapper[4677]: I1007 13:38:07.319590 4677 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c4ee04d7-9071-415d-94cd-daaace73b138" path="/var/lib/kubelet/pods/c4ee04d7-9071-415d-94cd-daaace73b138/volumes" Oct 07 13:38:11 crc kubenswrapper[4677]: I1007 13:38:11.208183 4677 scope.go:117] "RemoveContainer" containerID="76febbdf1ccc81a3e62a38de278b17635aaf21f4031be6093e9400238f2a2c78" Oct 07 13:38:11 crc kubenswrapper[4677]: I1007 13:38:11.247721 4677 scope.go:117] "RemoveContainer" containerID="17387912a8ae0d2d62857729b05ff38794fa8183757efc09b0f6e842eef7fff3" Oct 07 13:38:11 crc kubenswrapper[4677]: I1007 13:38:11.284673 4677 scope.go:117] "RemoveContainer" containerID="d308a1b583636f423146a7c96204148801c49e5ecf3d98421cb67c20d059ab95" Oct 07 13:38:11 crc kubenswrapper[4677]: I1007 13:38:11.310725 4677 scope.go:117] "RemoveContainer" containerID="00d43c02a7940f8edba2ab6223c3c3c22c2dac93d3333a1aeac76156a786a00a" Oct 07 13:38:16 crc kubenswrapper[4677]: I1007 13:38:16.306159 4677 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/openstack-galera-0"] Oct 07 13:38:16 crc kubenswrapper[4677]: I1007 13:38:16.316482 4677 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/openstack-galera-2"] Oct 07 13:38:16 crc kubenswrapper[4677]: I1007 13:38:16.324249 4677 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/openstack-galera-1"] Oct 07 13:38:16 crc kubenswrapper[4677]: I1007 13:38:16.417190 4677 kuberuntime_container.go:808] "Killing container with a grace period" pod="keystone-kuttl-tests/openstack-galera-2" podUID="3b168b6a-df61-44ed-8a09-ed30d3ecc2ea" containerName="galera" containerID="cri-o://1018dc138a07f457eeff18e8d87b096ccab51a8c0aea520899638fb2f9fdd362" gracePeriod=30 Oct 07 13:38:17 crc kubenswrapper[4677]: I1007 13:38:17.003105 4677 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/memcached-0"] Oct 07 13:38:17 crc kubenswrapper[4677]: I1007 13:38:17.003856 4677 kuberuntime_container.go:808] "Killing container with a grace period" pod="keystone-kuttl-tests/memcached-0" podUID="a8dabf1b-2f0f-4f7f-8342-31001928330b" containerName="memcached" containerID="cri-o://c1ca83bd1f6257f9e5b9a6701b8f358537b3598c83dd3dcf8388a9277d9189f7" gracePeriod=30 Oct 07 13:38:17 crc kubenswrapper[4677]: I1007 13:38:17.261106 4677 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/openstack-galera-2" Oct 07 13:38:17 crc kubenswrapper[4677]: I1007 13:38:17.384374 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7rt5v\" (UniqueName: \"kubernetes.io/projected/3b168b6a-df61-44ed-8a09-ed30d3ecc2ea-kube-api-access-7rt5v\") pod \"3b168b6a-df61-44ed-8a09-ed30d3ecc2ea\" (UID: \"3b168b6a-df61-44ed-8a09-ed30d3ecc2ea\") " Oct 07 13:38:17 crc kubenswrapper[4677]: I1007 13:38:17.384490 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/3b168b6a-df61-44ed-8a09-ed30d3ecc2ea-kolla-config\") pod \"3b168b6a-df61-44ed-8a09-ed30d3ecc2ea\" (UID: \"3b168b6a-df61-44ed-8a09-ed30d3ecc2ea\") " Oct 07 13:38:17 crc kubenswrapper[4677]: I1007 13:38:17.384585 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/3b168b6a-df61-44ed-8a09-ed30d3ecc2ea-secrets\") pod \"3b168b6a-df61-44ed-8a09-ed30d3ecc2ea\" (UID: \"3b168b6a-df61-44ed-8a09-ed30d3ecc2ea\") " Oct 07 13:38:17 crc kubenswrapper[4677]: I1007 13:38:17.384659 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/3b168b6a-df61-44ed-8a09-ed30d3ecc2ea-config-data-generated\") pod \"3b168b6a-df61-44ed-8a09-ed30d3ecc2ea\" (UID: \"3b168b6a-df61-44ed-8a09-ed30d3ecc2ea\") " Oct 07 13:38:17 crc kubenswrapper[4677]: I1007 13:38:17.384685 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3b168b6a-df61-44ed-8a09-ed30d3ecc2ea-operator-scripts\") pod \"3b168b6a-df61-44ed-8a09-ed30d3ecc2ea\" (UID: \"3b168b6a-df61-44ed-8a09-ed30d3ecc2ea\") " Oct 07 13:38:17 crc kubenswrapper[4677]: I1007 13:38:17.384830 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/3b168b6a-df61-44ed-8a09-ed30d3ecc2ea-config-data-default\") pod \"3b168b6a-df61-44ed-8a09-ed30d3ecc2ea\" (UID: \"3b168b6a-df61-44ed-8a09-ed30d3ecc2ea\") " Oct 07 13:38:17 crc kubenswrapper[4677]: I1007 13:38:17.384858 4677 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"3b168b6a-df61-44ed-8a09-ed30d3ecc2ea\" (UID: \"3b168b6a-df61-44ed-8a09-ed30d3ecc2ea\") " Oct 07 13:38:17 crc kubenswrapper[4677]: I1007 13:38:17.385288 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3b168b6a-df61-44ed-8a09-ed30d3ecc2ea-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "3b168b6a-df61-44ed-8a09-ed30d3ecc2ea" (UID: "3b168b6a-df61-44ed-8a09-ed30d3ecc2ea"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:38:17 crc kubenswrapper[4677]: I1007 13:38:17.385634 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3b168b6a-df61-44ed-8a09-ed30d3ecc2ea-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "3b168b6a-df61-44ed-8a09-ed30d3ecc2ea" (UID: "3b168b6a-df61-44ed-8a09-ed30d3ecc2ea"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:38:17 crc kubenswrapper[4677]: I1007 13:38:17.386030 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3b168b6a-df61-44ed-8a09-ed30d3ecc2ea-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "3b168b6a-df61-44ed-8a09-ed30d3ecc2ea" (UID: "3b168b6a-df61-44ed-8a09-ed30d3ecc2ea"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:38:17 crc kubenswrapper[4677]: I1007 13:38:17.387035 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3b168b6a-df61-44ed-8a09-ed30d3ecc2ea-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3b168b6a-df61-44ed-8a09-ed30d3ecc2ea" (UID: "3b168b6a-df61-44ed-8a09-ed30d3ecc2ea"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:38:17 crc kubenswrapper[4677]: I1007 13:38:17.390773 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b168b6a-df61-44ed-8a09-ed30d3ecc2ea-kube-api-access-7rt5v" (OuterVolumeSpecName: "kube-api-access-7rt5v") pod "3b168b6a-df61-44ed-8a09-ed30d3ecc2ea" (UID: "3b168b6a-df61-44ed-8a09-ed30d3ecc2ea"). InnerVolumeSpecName "kube-api-access-7rt5v". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:38:17 crc kubenswrapper[4677]: I1007 13:38:17.391340 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b168b6a-df61-44ed-8a09-ed30d3ecc2ea-secrets" (OuterVolumeSpecName: "secrets") pod "3b168b6a-df61-44ed-8a09-ed30d3ecc2ea" (UID: "3b168b6a-df61-44ed-8a09-ed30d3ecc2ea"). InnerVolumeSpecName "secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:38:17 crc kubenswrapper[4677]: I1007 13:38:17.398023 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "mysql-db") pod "3b168b6a-df61-44ed-8a09-ed30d3ecc2ea" (UID: "3b168b6a-df61-44ed-8a09-ed30d3ecc2ea"). InnerVolumeSpecName "local-storage07-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 07 13:38:17 crc kubenswrapper[4677]: I1007 13:38:17.405817 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/rabbitmq-server-0"] Oct 07 13:38:17 crc kubenswrapper[4677]: I1007 13:38:17.486206 4677 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/3b168b6a-df61-44ed-8a09-ed30d3ecc2ea-config-data-default\") on node \"crc\" DevicePath \"\"" Oct 07 13:38:17 crc kubenswrapper[4677]: I1007 13:38:17.486272 4677 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Oct 07 13:38:17 crc kubenswrapper[4677]: I1007 13:38:17.486290 4677 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7rt5v\" (UniqueName: \"kubernetes.io/projected/3b168b6a-df61-44ed-8a09-ed30d3ecc2ea-kube-api-access-7rt5v\") on node \"crc\" DevicePath \"\"" Oct 07 13:38:17 crc kubenswrapper[4677]: I1007 13:38:17.486308 4677 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/3b168b6a-df61-44ed-8a09-ed30d3ecc2ea-kolla-config\") on node \"crc\" DevicePath \"\"" Oct 07 13:38:17 crc kubenswrapper[4677]: I1007 13:38:17.486322 4677 reconciler_common.go:293] "Volume detached for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/3b168b6a-df61-44ed-8a09-ed30d3ecc2ea-secrets\") on node \"crc\" DevicePath \"\"" Oct 07 13:38:17 crc kubenswrapper[4677]: I1007 13:38:17.486338 4677 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/3b168b6a-df61-44ed-8a09-ed30d3ecc2ea-config-data-generated\") on node \"crc\" DevicePath \"\"" Oct 07 13:38:17 crc kubenswrapper[4677]: I1007 13:38:17.486351 4677 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3b168b6a-df61-44ed-8a09-ed30d3ecc2ea-operator-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 13:38:17 crc kubenswrapper[4677]: I1007 13:38:17.504859 4677 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Oct 07 13:38:17 crc kubenswrapper[4677]: I1007 13:38:17.587362 4677 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Oct 07 13:38:17 crc kubenswrapper[4677]: I1007 13:38:17.844318 4677 generic.go:334] "Generic (PLEG): container finished" podID="3b168b6a-df61-44ed-8a09-ed30d3ecc2ea" containerID="1018dc138a07f457eeff18e8d87b096ccab51a8c0aea520899638fb2f9fdd362" exitCode=0 Oct 07 13:38:17 crc kubenswrapper[4677]: I1007 13:38:17.844395 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/openstack-galera-2" event={"ID":"3b168b6a-df61-44ed-8a09-ed30d3ecc2ea","Type":"ContainerDied","Data":"1018dc138a07f457eeff18e8d87b096ccab51a8c0aea520899638fb2f9fdd362"} Oct 07 13:38:17 crc kubenswrapper[4677]: I1007 13:38:17.844398 4677 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/openstack-galera-2" Oct 07 13:38:17 crc kubenswrapper[4677]: I1007 13:38:17.844504 4677 scope.go:117] "RemoveContainer" containerID="1018dc138a07f457eeff18e8d87b096ccab51a8c0aea520899638fb2f9fdd362" Oct 07 13:38:17 crc kubenswrapper[4677]: I1007 13:38:17.844480 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/openstack-galera-2" event={"ID":"3b168b6a-df61-44ed-8a09-ed30d3ecc2ea","Type":"ContainerDied","Data":"acb4fb6a91db044ecddfe1cc74d6d29398ed860806fc3c2b7742bc4ee71d5da7"} Oct 07 13:38:17 crc kubenswrapper[4677]: I1007 13:38:17.901472 4677 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/rabbitmq-server-0"] Oct 07 13:38:17 crc kubenswrapper[4677]: I1007 13:38:17.903901 4677 scope.go:117] "RemoveContainer" containerID="7618a0b64d72ab769046158fe526d0ea6873dff05516b608e89c7452bb5ecf96" Oct 07 13:38:17 crc kubenswrapper[4677]: I1007 13:38:17.934468 4677 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/openstack-galera-2"] Oct 07 13:38:17 crc kubenswrapper[4677]: I1007 13:38:17.946545 4677 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/openstack-galera-2"] Oct 07 13:38:17 crc kubenswrapper[4677]: I1007 13:38:17.957797 4677 kuberuntime_container.go:808] "Killing container with a grace period" pod="keystone-kuttl-tests/rabbitmq-server-0" podUID="c50e7113-f37a-4ea1-9d53-c53106564a48" containerName="rabbitmq" containerID="cri-o://f59b7e3688a9b8b62920e8dfb9ee849dfa652fd717a87b9d6e0b5218753bf5bf" gracePeriod=604800 Oct 07 13:38:17 crc kubenswrapper[4677]: I1007 13:38:17.969534 4677 scope.go:117] "RemoveContainer" containerID="1018dc138a07f457eeff18e8d87b096ccab51a8c0aea520899638fb2f9fdd362" Oct 07 13:38:17 crc kubenswrapper[4677]: E1007 13:38:17.969851 4677 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1018dc138a07f457eeff18e8d87b096ccab51a8c0aea520899638fb2f9fdd362\": container with ID starting with 1018dc138a07f457eeff18e8d87b096ccab51a8c0aea520899638fb2f9fdd362 not found: ID does not exist" containerID="1018dc138a07f457eeff18e8d87b096ccab51a8c0aea520899638fb2f9fdd362" Oct 07 13:38:17 crc kubenswrapper[4677]: I1007 13:38:17.969883 4677 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1018dc138a07f457eeff18e8d87b096ccab51a8c0aea520899638fb2f9fdd362"} err="failed to get container status \"1018dc138a07f457eeff18e8d87b096ccab51a8c0aea520899638fb2f9fdd362\": rpc error: code = NotFound desc = could not find container \"1018dc138a07f457eeff18e8d87b096ccab51a8c0aea520899638fb2f9fdd362\": container with ID starting with 1018dc138a07f457eeff18e8d87b096ccab51a8c0aea520899638fb2f9fdd362 not found: ID does not exist" Oct 07 13:38:17 crc kubenswrapper[4677]: I1007 13:38:17.969902 4677 scope.go:117] "RemoveContainer" containerID="7618a0b64d72ab769046158fe526d0ea6873dff05516b608e89c7452bb5ecf96" Oct 07 13:38:17 crc kubenswrapper[4677]: E1007 13:38:17.970197 4677 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7618a0b64d72ab769046158fe526d0ea6873dff05516b608e89c7452bb5ecf96\": container with ID starting with 7618a0b64d72ab769046158fe526d0ea6873dff05516b608e89c7452bb5ecf96 not found: ID does not exist" containerID="7618a0b64d72ab769046158fe526d0ea6873dff05516b608e89c7452bb5ecf96" Oct 07 13:38:17 crc kubenswrapper[4677]: I1007 13:38:17.970223 4677 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7618a0b64d72ab769046158fe526d0ea6873dff05516b608e89c7452bb5ecf96"} err="failed to get container status \"7618a0b64d72ab769046158fe526d0ea6873dff05516b608e89c7452bb5ecf96\": rpc error: code = NotFound desc = could not find container \"7618a0b64d72ab769046158fe526d0ea6873dff05516b608e89c7452bb5ecf96\": container with ID starting with 7618a0b64d72ab769046158fe526d0ea6873dff05516b608e89c7452bb5ecf96 not found: ID does not exist" Oct 07 13:38:18 crc kubenswrapper[4677]: I1007 13:38:18.283489 4677 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/memcached-0" Oct 07 13:38:18 crc kubenswrapper[4677]: I1007 13:38:18.399162 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4gdc9\" (UniqueName: \"kubernetes.io/projected/a8dabf1b-2f0f-4f7f-8342-31001928330b-kube-api-access-4gdc9\") pod \"a8dabf1b-2f0f-4f7f-8342-31001928330b\" (UID: \"a8dabf1b-2f0f-4f7f-8342-31001928330b\") " Oct 07 13:38:18 crc kubenswrapper[4677]: I1007 13:38:18.399255 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a8dabf1b-2f0f-4f7f-8342-31001928330b-config-data\") pod \"a8dabf1b-2f0f-4f7f-8342-31001928330b\" (UID: \"a8dabf1b-2f0f-4f7f-8342-31001928330b\") " Oct 07 13:38:18 crc kubenswrapper[4677]: I1007 13:38:18.399367 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/a8dabf1b-2f0f-4f7f-8342-31001928330b-kolla-config\") pod \"a8dabf1b-2f0f-4f7f-8342-31001928330b\" (UID: \"a8dabf1b-2f0f-4f7f-8342-31001928330b\") " Oct 07 13:38:18 crc kubenswrapper[4677]: I1007 13:38:18.400690 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a8dabf1b-2f0f-4f7f-8342-31001928330b-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "a8dabf1b-2f0f-4f7f-8342-31001928330b" (UID: "a8dabf1b-2f0f-4f7f-8342-31001928330b"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:38:18 crc kubenswrapper[4677]: I1007 13:38:18.400721 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a8dabf1b-2f0f-4f7f-8342-31001928330b-config-data" (OuterVolumeSpecName: "config-data") pod "a8dabf1b-2f0f-4f7f-8342-31001928330b" (UID: "a8dabf1b-2f0f-4f7f-8342-31001928330b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:38:18 crc kubenswrapper[4677]: I1007 13:38:18.406048 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a8dabf1b-2f0f-4f7f-8342-31001928330b-kube-api-access-4gdc9" (OuterVolumeSpecName: "kube-api-access-4gdc9") pod "a8dabf1b-2f0f-4f7f-8342-31001928330b" (UID: "a8dabf1b-2f0f-4f7f-8342-31001928330b"). InnerVolumeSpecName "kube-api-access-4gdc9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:38:18 crc kubenswrapper[4677]: I1007 13:38:18.468480 4677 kuberuntime_container.go:808] "Killing container with a grace period" pod="keystone-kuttl-tests/openstack-galera-1" podUID="8ec4d41c-e5c7-49c2-be5b-2b228d1ddcb8" containerName="galera" containerID="cri-o://86b85efb94bf982e8572ff8e51fc224f0050532519784aa519a0408928d5d6de" gracePeriod=28 Oct 07 13:38:18 crc kubenswrapper[4677]: I1007 13:38:18.500913 4677 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4gdc9\" (UniqueName: \"kubernetes.io/projected/a8dabf1b-2f0f-4f7f-8342-31001928330b-kube-api-access-4gdc9\") on node \"crc\" DevicePath \"\"" Oct 07 13:38:18 crc kubenswrapper[4677]: I1007 13:38:18.500946 4677 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a8dabf1b-2f0f-4f7f-8342-31001928330b-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 13:38:18 crc kubenswrapper[4677]: I1007 13:38:18.500956 4677 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/a8dabf1b-2f0f-4f7f-8342-31001928330b-kolla-config\") on node \"crc\" DevicePath \"\"" Oct 07 13:38:18 crc kubenswrapper[4677]: I1007 13:38:18.817063 4677 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-54986549bb-c7n9b"] Oct 07 13:38:18 crc kubenswrapper[4677]: I1007 13:38:18.817318 4677 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/keystone-operator-controller-manager-54986549bb-c7n9b" podUID="5f1035cf-bd50-4425-a342-92a71ab7f16e" containerName="manager" containerID="cri-o://3c14f8b5cf2ad85d869cde7f7f826e8c464c4d90ede3f8867bd428890a3b0079" gracePeriod=10 Oct 07 13:38:18 crc kubenswrapper[4677]: I1007 13:38:18.817400 4677 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/keystone-operator-controller-manager-54986549bb-c7n9b" podUID="5f1035cf-bd50-4425-a342-92a71ab7f16e" containerName="kube-rbac-proxy" containerID="cri-o://0116fd6016f6ae8f8544a6f2bb9eaa3cd0cf9e7ecb9b56eaebdb22682d49d205" gracePeriod=10 Oct 07 13:38:18 crc kubenswrapper[4677]: I1007 13:38:18.851310 4677 generic.go:334] "Generic (PLEG): container finished" podID="a8dabf1b-2f0f-4f7f-8342-31001928330b" containerID="c1ca83bd1f6257f9e5b9a6701b8f358537b3598c83dd3dcf8388a9277d9189f7" exitCode=0 Oct 07 13:38:18 crc kubenswrapper[4677]: I1007 13:38:18.851374 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/memcached-0" event={"ID":"a8dabf1b-2f0f-4f7f-8342-31001928330b","Type":"ContainerDied","Data":"c1ca83bd1f6257f9e5b9a6701b8f358537b3598c83dd3dcf8388a9277d9189f7"} Oct 07 13:38:18 crc kubenswrapper[4677]: I1007 13:38:18.851409 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/memcached-0" event={"ID":"a8dabf1b-2f0f-4f7f-8342-31001928330b","Type":"ContainerDied","Data":"58362ffda0c2edac31261f1351d114eee6f2f97fb6809d85dc3e390f14c55a78"} Oct 07 13:38:18 crc kubenswrapper[4677]: I1007 13:38:18.851449 4677 scope.go:117] "RemoveContainer" containerID="c1ca83bd1f6257f9e5b9a6701b8f358537b3598c83dd3dcf8388a9277d9189f7" Oct 07 13:38:18 crc kubenswrapper[4677]: I1007 13:38:18.851523 4677 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/memcached-0" Oct 07 13:38:19 crc kubenswrapper[4677]: I1007 13:38:19.006258 4677 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/memcached-0"] Oct 07 13:38:19 crc kubenswrapper[4677]: I1007 13:38:19.010676 4677 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/memcached-0"] Oct 07 13:38:19 crc kubenswrapper[4677]: I1007 13:38:19.020661 4677 scope.go:117] "RemoveContainer" containerID="c1ca83bd1f6257f9e5b9a6701b8f358537b3598c83dd3dcf8388a9277d9189f7" Oct 07 13:38:19 crc kubenswrapper[4677]: E1007 13:38:19.029617 4677 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c1ca83bd1f6257f9e5b9a6701b8f358537b3598c83dd3dcf8388a9277d9189f7\": container with ID starting with c1ca83bd1f6257f9e5b9a6701b8f358537b3598c83dd3dcf8388a9277d9189f7 not found: ID does not exist" containerID="c1ca83bd1f6257f9e5b9a6701b8f358537b3598c83dd3dcf8388a9277d9189f7" Oct 07 13:38:19 crc kubenswrapper[4677]: I1007 13:38:19.029663 4677 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c1ca83bd1f6257f9e5b9a6701b8f358537b3598c83dd3dcf8388a9277d9189f7"} err="failed to get container status \"c1ca83bd1f6257f9e5b9a6701b8f358537b3598c83dd3dcf8388a9277d9189f7\": rpc error: code = NotFound desc = could not find container \"c1ca83bd1f6257f9e5b9a6701b8f358537b3598c83dd3dcf8388a9277d9189f7\": container with ID starting with c1ca83bd1f6257f9e5b9a6701b8f358537b3598c83dd3dcf8388a9277d9189f7 not found: ID does not exist" Oct 07 13:38:19 crc kubenswrapper[4677]: I1007 13:38:19.123705 4677 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/keystone-operator-index-dptwb"] Oct 07 13:38:19 crc kubenswrapper[4677]: I1007 13:38:19.123982 4677 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/keystone-operator-index-dptwb" podUID="869e5c88-2679-4b90-b674-bba233dc88e0" containerName="registry-server" containerID="cri-o://cba042fd564e4bfb156e628e61aecc1fd4db9d45c4953dfe62fcdc4005a4d46b" gracePeriod=30 Oct 07 13:38:19 crc kubenswrapper[4677]: I1007 13:38:19.164038 4677 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/3d8ae2547c933a67aba0b0e56cba361860dfd1d9c4832c2b34acb572d778bsn"] Oct 07 13:38:19 crc kubenswrapper[4677]: I1007 13:38:19.167550 4677 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/3d8ae2547c933a67aba0b0e56cba361860dfd1d9c4832c2b34acb572d778bsn"] Oct 07 13:38:19 crc kubenswrapper[4677]: I1007 13:38:19.305681 4677 scope.go:117] "RemoveContainer" containerID="9c395efdf36cc3142121da31bd43ed02b4d941c9bf27ad091d7892140334598b" Oct 07 13:38:19 crc kubenswrapper[4677]: E1007 13:38:19.305949 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-r7cnz_openshift-machine-config-operator(7879fa59-a7cb-4d29-ba3a-c91f43bfcba6)\"" pod="openshift-machine-config-operator/machine-config-daemon-r7cnz" podUID="7879fa59-a7cb-4d29-ba3a-c91f43bfcba6" Oct 07 13:38:19 crc kubenswrapper[4677]: I1007 13:38:19.328844 4677 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3b168b6a-df61-44ed-8a09-ed30d3ecc2ea" path="/var/lib/kubelet/pods/3b168b6a-df61-44ed-8a09-ed30d3ecc2ea/volumes" Oct 07 13:38:19 crc kubenswrapper[4677]: I1007 
13:38:19.329518 4677 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a8dabf1b-2f0f-4f7f-8342-31001928330b" path="/var/lib/kubelet/pods/a8dabf1b-2f0f-4f7f-8342-31001928330b/volumes" Oct 07 13:38:19 crc kubenswrapper[4677]: I1007 13:38:19.330070 4677 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f6b65f72-d122-49d5-83dd-06541b985a21" path="/var/lib/kubelet/pods/f6b65f72-d122-49d5-83dd-06541b985a21/volumes" Oct 07 13:38:19 crc kubenswrapper[4677]: I1007 13:38:19.354288 4677 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-54986549bb-c7n9b" Oct 07 13:38:19 crc kubenswrapper[4677]: I1007 13:38:19.387547 4677 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/rabbitmq-server-0" Oct 07 13:38:19 crc kubenswrapper[4677]: I1007 13:38:19.511797 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c50e7113-f37a-4ea1-9d53-c53106564a48-rabbitmq-erlang-cookie\") pod \"c50e7113-f37a-4ea1-9d53-c53106564a48\" (UID: \"c50e7113-f37a-4ea1-9d53-c53106564a48\") " Oct 07 13:38:19 crc kubenswrapper[4677]: I1007 13:38:19.511856 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/5f1035cf-bd50-4425-a342-92a71ab7f16e-apiservice-cert\") pod \"5f1035cf-bd50-4425-a342-92a71ab7f16e\" (UID: \"5f1035cf-bd50-4425-a342-92a71ab7f16e\") " Oct 07 13:38:19 crc kubenswrapper[4677]: I1007 13:38:19.511925 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c50e7113-f37a-4ea1-9d53-c53106564a48-rabbitmq-confd\") pod \"c50e7113-f37a-4ea1-9d53-c53106564a48\" (UID: \"c50e7113-f37a-4ea1-9d53-c53106564a48\") " Oct 07 13:38:19 crc kubenswrapper[4677]: I1007 13:38:19.511973 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c50e7113-f37a-4ea1-9d53-c53106564a48-plugins-conf\") pod \"c50e7113-f37a-4ea1-9d53-c53106564a48\" (UID: \"c50e7113-f37a-4ea1-9d53-c53106564a48\") " Oct 07 13:38:19 crc kubenswrapper[4677]: I1007 13:38:19.512006 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c50e7113-f37a-4ea1-9d53-c53106564a48-erlang-cookie-secret\") pod \"c50e7113-f37a-4ea1-9d53-c53106564a48\" (UID: \"c50e7113-f37a-4ea1-9d53-c53106564a48\") " Oct 07 13:38:19 crc kubenswrapper[4677]: I1007 13:38:19.512034 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c50e7113-f37a-4ea1-9d53-c53106564a48-rabbitmq-plugins\") pod \"c50e7113-f37a-4ea1-9d53-c53106564a48\" (UID: \"c50e7113-f37a-4ea1-9d53-c53106564a48\") " Oct 07 13:38:19 crc kubenswrapper[4677]: I1007 13:38:19.512155 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-fa3a4c81-a114-45bc-95e2-8ce46e6940f1\") pod \"c50e7113-f37a-4ea1-9d53-c53106564a48\" (UID: \"c50e7113-f37a-4ea1-9d53-c53106564a48\") " Oct 07 13:38:19 crc kubenswrapper[4677]: I1007 13:38:19.512186 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" 
(UniqueName: \"kubernetes.io/secret/5f1035cf-bd50-4425-a342-92a71ab7f16e-webhook-cert\") pod \"5f1035cf-bd50-4425-a342-92a71ab7f16e\" (UID: \"5f1035cf-bd50-4425-a342-92a71ab7f16e\") " Oct 07 13:38:19 crc kubenswrapper[4677]: I1007 13:38:19.512252 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c50e7113-f37a-4ea1-9d53-c53106564a48-pod-info\") pod \"c50e7113-f37a-4ea1-9d53-c53106564a48\" (UID: \"c50e7113-f37a-4ea1-9d53-c53106564a48\") " Oct 07 13:38:19 crc kubenswrapper[4677]: I1007 13:38:19.512273 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fzwbp\" (UniqueName: \"kubernetes.io/projected/5f1035cf-bd50-4425-a342-92a71ab7f16e-kube-api-access-fzwbp\") pod \"5f1035cf-bd50-4425-a342-92a71ab7f16e\" (UID: \"5f1035cf-bd50-4425-a342-92a71ab7f16e\") " Oct 07 13:38:19 crc kubenswrapper[4677]: I1007 13:38:19.512311 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tdrlz\" (UniqueName: \"kubernetes.io/projected/c50e7113-f37a-4ea1-9d53-c53106564a48-kube-api-access-tdrlz\") pod \"c50e7113-f37a-4ea1-9d53-c53106564a48\" (UID: \"c50e7113-f37a-4ea1-9d53-c53106564a48\") " Oct 07 13:38:19 crc kubenswrapper[4677]: I1007 13:38:19.512384 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c50e7113-f37a-4ea1-9d53-c53106564a48-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "c50e7113-f37a-4ea1-9d53-c53106564a48" (UID: "c50e7113-f37a-4ea1-9d53-c53106564a48"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:38:19 crc kubenswrapper[4677]: I1007 13:38:19.512834 4677 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c50e7113-f37a-4ea1-9d53-c53106564a48-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Oct 07 13:38:19 crc kubenswrapper[4677]: I1007 13:38:19.512832 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c50e7113-f37a-4ea1-9d53-c53106564a48-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "c50e7113-f37a-4ea1-9d53-c53106564a48" (UID: "c50e7113-f37a-4ea1-9d53-c53106564a48"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:38:19 crc kubenswrapper[4677]: I1007 13:38:19.514924 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c50e7113-f37a-4ea1-9d53-c53106564a48-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "c50e7113-f37a-4ea1-9d53-c53106564a48" (UID: "c50e7113-f37a-4ea1-9d53-c53106564a48"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:38:19 crc kubenswrapper[4677]: I1007 13:38:19.517174 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c50e7113-f37a-4ea1-9d53-c53106564a48-kube-api-access-tdrlz" (OuterVolumeSpecName: "kube-api-access-tdrlz") pod "c50e7113-f37a-4ea1-9d53-c53106564a48" (UID: "c50e7113-f37a-4ea1-9d53-c53106564a48"). InnerVolumeSpecName "kube-api-access-tdrlz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:38:19 crc kubenswrapper[4677]: I1007 13:38:19.517817 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f1035cf-bd50-4425-a342-92a71ab7f16e-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "5f1035cf-bd50-4425-a342-92a71ab7f16e" (UID: "5f1035cf-bd50-4425-a342-92a71ab7f16e"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:38:19 crc kubenswrapper[4677]: I1007 13:38:19.519289 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f1035cf-bd50-4425-a342-92a71ab7f16e-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "5f1035cf-bd50-4425-a342-92a71ab7f16e" (UID: "5f1035cf-bd50-4425-a342-92a71ab7f16e"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:38:19 crc kubenswrapper[4677]: I1007 13:38:19.520199 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c50e7113-f37a-4ea1-9d53-c53106564a48-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "c50e7113-f37a-4ea1-9d53-c53106564a48" (UID: "c50e7113-f37a-4ea1-9d53-c53106564a48"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:38:19 crc kubenswrapper[4677]: I1007 13:38:19.521448 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f1035cf-bd50-4425-a342-92a71ab7f16e-kube-api-access-fzwbp" (OuterVolumeSpecName: "kube-api-access-fzwbp") pod "5f1035cf-bd50-4425-a342-92a71ab7f16e" (UID: "5f1035cf-bd50-4425-a342-92a71ab7f16e"). InnerVolumeSpecName "kube-api-access-fzwbp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:38:19 crc kubenswrapper[4677]: I1007 13:38:19.524835 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/c50e7113-f37a-4ea1-9d53-c53106564a48-pod-info" (OuterVolumeSpecName: "pod-info") pod "c50e7113-f37a-4ea1-9d53-c53106564a48" (UID: "c50e7113-f37a-4ea1-9d53-c53106564a48"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Oct 07 13:38:19 crc kubenswrapper[4677]: I1007 13:38:19.531148 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-fa3a4c81-a114-45bc-95e2-8ce46e6940f1" (OuterVolumeSpecName: "persistence") pod "c50e7113-f37a-4ea1-9d53-c53106564a48" (UID: "c50e7113-f37a-4ea1-9d53-c53106564a48"). InnerVolumeSpecName "pvc-fa3a4c81-a114-45bc-95e2-8ce46e6940f1". PluginName "kubernetes.io/csi", VolumeGidValue "" Oct 07 13:38:19 crc kubenswrapper[4677]: I1007 13:38:19.571042 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c50e7113-f37a-4ea1-9d53-c53106564a48-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "c50e7113-f37a-4ea1-9d53-c53106564a48" (UID: "c50e7113-f37a-4ea1-9d53-c53106564a48"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:38:19 crc kubenswrapper[4677]: I1007 13:38:19.592565 4677 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-index-dptwb" Oct 07 13:38:19 crc kubenswrapper[4677]: I1007 13:38:19.616090 4677 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fzwbp\" (UniqueName: \"kubernetes.io/projected/5f1035cf-bd50-4425-a342-92a71ab7f16e-kube-api-access-fzwbp\") on node \"crc\" DevicePath \"\"" Oct 07 13:38:19 crc kubenswrapper[4677]: I1007 13:38:19.616120 4677 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tdrlz\" (UniqueName: \"kubernetes.io/projected/c50e7113-f37a-4ea1-9d53-c53106564a48-kube-api-access-tdrlz\") on node \"crc\" DevicePath \"\"" Oct 07 13:38:19 crc kubenswrapper[4677]: I1007 13:38:19.616130 4677 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/5f1035cf-bd50-4425-a342-92a71ab7f16e-apiservice-cert\") on node \"crc\" DevicePath \"\"" Oct 07 13:38:19 crc kubenswrapper[4677]: I1007 13:38:19.616139 4677 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c50e7113-f37a-4ea1-9d53-c53106564a48-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Oct 07 13:38:19 crc kubenswrapper[4677]: I1007 13:38:19.616148 4677 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c50e7113-f37a-4ea1-9d53-c53106564a48-plugins-conf\") on node \"crc\" DevicePath \"\"" Oct 07 13:38:19 crc kubenswrapper[4677]: I1007 13:38:19.616156 4677 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c50e7113-f37a-4ea1-9d53-c53106564a48-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Oct 07 13:38:19 crc kubenswrapper[4677]: I1007 13:38:19.616164 4677 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c50e7113-f37a-4ea1-9d53-c53106564a48-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Oct 07 13:38:19 crc kubenswrapper[4677]: I1007 13:38:19.616199 4677 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-fa3a4c81-a114-45bc-95e2-8ce46e6940f1\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-fa3a4c81-a114-45bc-95e2-8ce46e6940f1\") on node \"crc\" " Oct 07 13:38:19 crc kubenswrapper[4677]: I1007 13:38:19.616209 4677 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/5f1035cf-bd50-4425-a342-92a71ab7f16e-webhook-cert\") on node \"crc\" DevicePath \"\"" Oct 07 13:38:19 crc kubenswrapper[4677]: I1007 13:38:19.616218 4677 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c50e7113-f37a-4ea1-9d53-c53106564a48-pod-info\") on node \"crc\" DevicePath \"\"" Oct 07 13:38:19 crc kubenswrapper[4677]: I1007 13:38:19.631846 4677 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Oct 07 13:38:19 crc kubenswrapper[4677]: I1007 13:38:19.631993 4677 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-fa3a4c81-a114-45bc-95e2-8ce46e6940f1" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-fa3a4c81-a114-45bc-95e2-8ce46e6940f1") on node "crc" Oct 07 13:38:19 crc kubenswrapper[4677]: I1007 13:38:19.717255 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bwxqf\" (UniqueName: \"kubernetes.io/projected/869e5c88-2679-4b90-b674-bba233dc88e0-kube-api-access-bwxqf\") pod \"869e5c88-2679-4b90-b674-bba233dc88e0\" (UID: \"869e5c88-2679-4b90-b674-bba233dc88e0\") " Oct 07 13:38:19 crc kubenswrapper[4677]: I1007 13:38:19.717601 4677 reconciler_common.go:293] "Volume detached for volume \"pvc-fa3a4c81-a114-45bc-95e2-8ce46e6940f1\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-fa3a4c81-a114-45bc-95e2-8ce46e6940f1\") on node \"crc\" DevicePath \"\"" Oct 07 13:38:19 crc kubenswrapper[4677]: I1007 13:38:19.719986 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/869e5c88-2679-4b90-b674-bba233dc88e0-kube-api-access-bwxqf" (OuterVolumeSpecName: "kube-api-access-bwxqf") pod "869e5c88-2679-4b90-b674-bba233dc88e0" (UID: "869e5c88-2679-4b90-b674-bba233dc88e0"). InnerVolumeSpecName "kube-api-access-bwxqf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:38:19 crc kubenswrapper[4677]: I1007 13:38:19.819698 4677 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bwxqf\" (UniqueName: \"kubernetes.io/projected/869e5c88-2679-4b90-b674-bba233dc88e0-kube-api-access-bwxqf\") on node \"crc\" DevicePath \"\"" Oct 07 13:38:19 crc kubenswrapper[4677]: I1007 13:38:19.874806 4677 generic.go:334] "Generic (PLEG): container finished" podID="c50e7113-f37a-4ea1-9d53-c53106564a48" containerID="f59b7e3688a9b8b62920e8dfb9ee849dfa652fd717a87b9d6e0b5218753bf5bf" exitCode=0 Oct 07 13:38:19 crc kubenswrapper[4677]: I1007 13:38:19.874916 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/rabbitmq-server-0" event={"ID":"c50e7113-f37a-4ea1-9d53-c53106564a48","Type":"ContainerDied","Data":"f59b7e3688a9b8b62920e8dfb9ee849dfa652fd717a87b9d6e0b5218753bf5bf"} Oct 07 13:38:19 crc kubenswrapper[4677]: I1007 13:38:19.874956 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/rabbitmq-server-0" event={"ID":"c50e7113-f37a-4ea1-9d53-c53106564a48","Type":"ContainerDied","Data":"b8337d16321f5fa19f2ed880701bf97c952fffc2dc4005f8c06cdfabae088f16"} Oct 07 13:38:19 crc kubenswrapper[4677]: I1007 13:38:19.874953 4677 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/rabbitmq-server-0" Oct 07 13:38:19 crc kubenswrapper[4677]: I1007 13:38:19.875002 4677 scope.go:117] "RemoveContainer" containerID="f59b7e3688a9b8b62920e8dfb9ee849dfa652fd717a87b9d6e0b5218753bf5bf" Oct 07 13:38:19 crc kubenswrapper[4677]: I1007 13:38:19.879083 4677 generic.go:334] "Generic (PLEG): container finished" podID="5f1035cf-bd50-4425-a342-92a71ab7f16e" containerID="0116fd6016f6ae8f8544a6f2bb9eaa3cd0cf9e7ecb9b56eaebdb22682d49d205" exitCode=0 Oct 07 13:38:19 crc kubenswrapper[4677]: I1007 13:38:19.879149 4677 generic.go:334] "Generic (PLEG): container finished" podID="5f1035cf-bd50-4425-a342-92a71ab7f16e" containerID="3c14f8b5cf2ad85d869cde7f7f826e8c464c4d90ede3f8867bd428890a3b0079" exitCode=0 Oct 07 13:38:19 crc kubenswrapper[4677]: I1007 13:38:19.879244 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-54986549bb-c7n9b" event={"ID":"5f1035cf-bd50-4425-a342-92a71ab7f16e","Type":"ContainerDied","Data":"0116fd6016f6ae8f8544a6f2bb9eaa3cd0cf9e7ecb9b56eaebdb22682d49d205"} Oct 07 13:38:19 crc kubenswrapper[4677]: I1007 13:38:19.879298 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-54986549bb-c7n9b" event={"ID":"5f1035cf-bd50-4425-a342-92a71ab7f16e","Type":"ContainerDied","Data":"3c14f8b5cf2ad85d869cde7f7f826e8c464c4d90ede3f8867bd428890a3b0079"} Oct 07 13:38:19 crc kubenswrapper[4677]: I1007 13:38:19.879326 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-54986549bb-c7n9b" event={"ID":"5f1035cf-bd50-4425-a342-92a71ab7f16e","Type":"ContainerDied","Data":"bc3bb6d890a0a2f2a5b6b0ced856021f9911ef39c61f806ebc9c04bfd0024754"} Oct 07 13:38:19 crc kubenswrapper[4677]: I1007 13:38:19.879425 4677 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-54986549bb-c7n9b" Oct 07 13:38:19 crc kubenswrapper[4677]: I1007 13:38:19.888711 4677 generic.go:334] "Generic (PLEG): container finished" podID="869e5c88-2679-4b90-b674-bba233dc88e0" containerID="cba042fd564e4bfb156e628e61aecc1fd4db9d45c4953dfe62fcdc4005a4d46b" exitCode=0 Oct 07 13:38:19 crc kubenswrapper[4677]: I1007 13:38:19.888771 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-index-dptwb" event={"ID":"869e5c88-2679-4b90-b674-bba233dc88e0","Type":"ContainerDied","Data":"cba042fd564e4bfb156e628e61aecc1fd4db9d45c4953dfe62fcdc4005a4d46b"} Oct 07 13:38:19 crc kubenswrapper[4677]: I1007 13:38:19.888826 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-index-dptwb" event={"ID":"869e5c88-2679-4b90-b674-bba233dc88e0","Type":"ContainerDied","Data":"a6a12022e0bc5bea56657c9e9d7efbe0461519c6663e2a1c68e13bce0e63dee8"} Oct 07 13:38:19 crc kubenswrapper[4677]: I1007 13:38:19.888947 4677 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-index-dptwb" Oct 07 13:38:19 crc kubenswrapper[4677]: I1007 13:38:19.915921 4677 scope.go:117] "RemoveContainer" containerID="4bd0bb354d34a9f6279b5aa80df037c139c91e480664fe124c3295f33023ff11" Oct 07 13:38:19 crc kubenswrapper[4677]: I1007 13:38:19.938121 4677 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/rabbitmq-server-0"] Oct 07 13:38:19 crc kubenswrapper[4677]: I1007 13:38:19.943083 4677 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/rabbitmq-server-0"] Oct 07 13:38:19 crc kubenswrapper[4677]: I1007 13:38:19.949515 4677 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-54986549bb-c7n9b"] Oct 07 13:38:19 crc kubenswrapper[4677]: I1007 13:38:19.953422 4677 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-54986549bb-c7n9b"] Oct 07 13:38:19 crc kubenswrapper[4677]: I1007 13:38:19.961391 4677 scope.go:117] "RemoveContainer" containerID="f59b7e3688a9b8b62920e8dfb9ee849dfa652fd717a87b9d6e0b5218753bf5bf" Oct 07 13:38:19 crc kubenswrapper[4677]: E1007 13:38:19.962534 4677 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f59b7e3688a9b8b62920e8dfb9ee849dfa652fd717a87b9d6e0b5218753bf5bf\": container with ID starting with f59b7e3688a9b8b62920e8dfb9ee849dfa652fd717a87b9d6e0b5218753bf5bf not found: ID does not exist" containerID="f59b7e3688a9b8b62920e8dfb9ee849dfa652fd717a87b9d6e0b5218753bf5bf" Oct 07 13:38:19 crc kubenswrapper[4677]: I1007 13:38:19.962567 4677 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f59b7e3688a9b8b62920e8dfb9ee849dfa652fd717a87b9d6e0b5218753bf5bf"} err="failed to get container status \"f59b7e3688a9b8b62920e8dfb9ee849dfa652fd717a87b9d6e0b5218753bf5bf\": rpc error: code = NotFound desc = could not find container \"f59b7e3688a9b8b62920e8dfb9ee849dfa652fd717a87b9d6e0b5218753bf5bf\": container with ID starting with f59b7e3688a9b8b62920e8dfb9ee849dfa652fd717a87b9d6e0b5218753bf5bf not found: ID does not exist" Oct 07 13:38:19 crc kubenswrapper[4677]: I1007 13:38:19.962592 4677 scope.go:117] "RemoveContainer" containerID="4bd0bb354d34a9f6279b5aa80df037c139c91e480664fe124c3295f33023ff11" Oct 07 13:38:19 crc kubenswrapper[4677]: I1007 13:38:19.962911 4677 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/keystone-operator-index-dptwb"] Oct 07 13:38:19 crc kubenswrapper[4677]: E1007 13:38:19.963159 4677 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4bd0bb354d34a9f6279b5aa80df037c139c91e480664fe124c3295f33023ff11\": container with ID starting with 4bd0bb354d34a9f6279b5aa80df037c139c91e480664fe124c3295f33023ff11 not found: ID does not exist" containerID="4bd0bb354d34a9f6279b5aa80df037c139c91e480664fe124c3295f33023ff11" Oct 07 13:38:19 crc kubenswrapper[4677]: I1007 13:38:19.963189 4677 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4bd0bb354d34a9f6279b5aa80df037c139c91e480664fe124c3295f33023ff11"} err="failed to get container status \"4bd0bb354d34a9f6279b5aa80df037c139c91e480664fe124c3295f33023ff11\": rpc error: code = NotFound desc = could not find container \"4bd0bb354d34a9f6279b5aa80df037c139c91e480664fe124c3295f33023ff11\": container with ID starting with 
4bd0bb354d34a9f6279b5aa80df037c139c91e480664fe124c3295f33023ff11 not found: ID does not exist" Oct 07 13:38:19 crc kubenswrapper[4677]: I1007 13:38:19.963209 4677 scope.go:117] "RemoveContainer" containerID="0116fd6016f6ae8f8544a6f2bb9eaa3cd0cf9e7ecb9b56eaebdb22682d49d205" Oct 07 13:38:19 crc kubenswrapper[4677]: I1007 13:38:19.968933 4677 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/keystone-operator-index-dptwb"] Oct 07 13:38:19 crc kubenswrapper[4677]: I1007 13:38:19.977642 4677 scope.go:117] "RemoveContainer" containerID="3c14f8b5cf2ad85d869cde7f7f826e8c464c4d90ede3f8867bd428890a3b0079" Oct 07 13:38:19 crc kubenswrapper[4677]: I1007 13:38:19.992404 4677 scope.go:117] "RemoveContainer" containerID="0116fd6016f6ae8f8544a6f2bb9eaa3cd0cf9e7ecb9b56eaebdb22682d49d205" Oct 07 13:38:19 crc kubenswrapper[4677]: E1007 13:38:19.992751 4677 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0116fd6016f6ae8f8544a6f2bb9eaa3cd0cf9e7ecb9b56eaebdb22682d49d205\": container with ID starting with 0116fd6016f6ae8f8544a6f2bb9eaa3cd0cf9e7ecb9b56eaebdb22682d49d205 not found: ID does not exist" containerID="0116fd6016f6ae8f8544a6f2bb9eaa3cd0cf9e7ecb9b56eaebdb22682d49d205" Oct 07 13:38:19 crc kubenswrapper[4677]: I1007 13:38:19.992848 4677 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0116fd6016f6ae8f8544a6f2bb9eaa3cd0cf9e7ecb9b56eaebdb22682d49d205"} err="failed to get container status \"0116fd6016f6ae8f8544a6f2bb9eaa3cd0cf9e7ecb9b56eaebdb22682d49d205\": rpc error: code = NotFound desc = could not find container \"0116fd6016f6ae8f8544a6f2bb9eaa3cd0cf9e7ecb9b56eaebdb22682d49d205\": container with ID starting with 0116fd6016f6ae8f8544a6f2bb9eaa3cd0cf9e7ecb9b56eaebdb22682d49d205 not found: ID does not exist" Oct 07 13:38:19 crc kubenswrapper[4677]: I1007 13:38:19.993010 4677 scope.go:117] "RemoveContainer" containerID="3c14f8b5cf2ad85d869cde7f7f826e8c464c4d90ede3f8867bd428890a3b0079" Oct 07 13:38:19 crc kubenswrapper[4677]: E1007 13:38:19.993357 4677 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3c14f8b5cf2ad85d869cde7f7f826e8c464c4d90ede3f8867bd428890a3b0079\": container with ID starting with 3c14f8b5cf2ad85d869cde7f7f826e8c464c4d90ede3f8867bd428890a3b0079 not found: ID does not exist" containerID="3c14f8b5cf2ad85d869cde7f7f826e8c464c4d90ede3f8867bd428890a3b0079" Oct 07 13:38:19 crc kubenswrapper[4677]: I1007 13:38:19.993452 4677 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3c14f8b5cf2ad85d869cde7f7f826e8c464c4d90ede3f8867bd428890a3b0079"} err="failed to get container status \"3c14f8b5cf2ad85d869cde7f7f826e8c464c4d90ede3f8867bd428890a3b0079\": rpc error: code = NotFound desc = could not find container \"3c14f8b5cf2ad85d869cde7f7f826e8c464c4d90ede3f8867bd428890a3b0079\": container with ID starting with 3c14f8b5cf2ad85d869cde7f7f826e8c464c4d90ede3f8867bd428890a3b0079 not found: ID does not exist" Oct 07 13:38:19 crc kubenswrapper[4677]: I1007 13:38:19.993517 4677 scope.go:117] "RemoveContainer" containerID="0116fd6016f6ae8f8544a6f2bb9eaa3cd0cf9e7ecb9b56eaebdb22682d49d205" Oct 07 13:38:19 crc kubenswrapper[4677]: I1007 13:38:19.993904 4677 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0116fd6016f6ae8f8544a6f2bb9eaa3cd0cf9e7ecb9b56eaebdb22682d49d205"} err="failed to get container status 
\"0116fd6016f6ae8f8544a6f2bb9eaa3cd0cf9e7ecb9b56eaebdb22682d49d205\": rpc error: code = NotFound desc = could not find container \"0116fd6016f6ae8f8544a6f2bb9eaa3cd0cf9e7ecb9b56eaebdb22682d49d205\": container with ID starting with 0116fd6016f6ae8f8544a6f2bb9eaa3cd0cf9e7ecb9b56eaebdb22682d49d205 not found: ID does not exist" Oct 07 13:38:19 crc kubenswrapper[4677]: I1007 13:38:19.993967 4677 scope.go:117] "RemoveContainer" containerID="3c14f8b5cf2ad85d869cde7f7f826e8c464c4d90ede3f8867bd428890a3b0079" Oct 07 13:38:19 crc kubenswrapper[4677]: I1007 13:38:19.994266 4677 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3c14f8b5cf2ad85d869cde7f7f826e8c464c4d90ede3f8867bd428890a3b0079"} err="failed to get container status \"3c14f8b5cf2ad85d869cde7f7f826e8c464c4d90ede3f8867bd428890a3b0079\": rpc error: code = NotFound desc = could not find container \"3c14f8b5cf2ad85d869cde7f7f826e8c464c4d90ede3f8867bd428890a3b0079\": container with ID starting with 3c14f8b5cf2ad85d869cde7f7f826e8c464c4d90ede3f8867bd428890a3b0079 not found: ID does not exist" Oct 07 13:38:19 crc kubenswrapper[4677]: I1007 13:38:19.994362 4677 scope.go:117] "RemoveContainer" containerID="cba042fd564e4bfb156e628e61aecc1fd4db9d45c4953dfe62fcdc4005a4d46b" Oct 07 13:38:20 crc kubenswrapper[4677]: I1007 13:38:20.145393 4677 scope.go:117] "RemoveContainer" containerID="cba042fd564e4bfb156e628e61aecc1fd4db9d45c4953dfe62fcdc4005a4d46b" Oct 07 13:38:20 crc kubenswrapper[4677]: E1007 13:38:20.145768 4677 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cba042fd564e4bfb156e628e61aecc1fd4db9d45c4953dfe62fcdc4005a4d46b\": container with ID starting with cba042fd564e4bfb156e628e61aecc1fd4db9d45c4953dfe62fcdc4005a4d46b not found: ID does not exist" containerID="cba042fd564e4bfb156e628e61aecc1fd4db9d45c4953dfe62fcdc4005a4d46b" Oct 07 13:38:20 crc kubenswrapper[4677]: I1007 13:38:20.145828 4677 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cba042fd564e4bfb156e628e61aecc1fd4db9d45c4953dfe62fcdc4005a4d46b"} err="failed to get container status \"cba042fd564e4bfb156e628e61aecc1fd4db9d45c4953dfe62fcdc4005a4d46b\": rpc error: code = NotFound desc = could not find container \"cba042fd564e4bfb156e628e61aecc1fd4db9d45c4953dfe62fcdc4005a4d46b\": container with ID starting with cba042fd564e4bfb156e628e61aecc1fd4db9d45c4953dfe62fcdc4005a4d46b not found: ID does not exist" Oct 07 13:38:20 crc kubenswrapper[4677]: I1007 13:38:20.331686 4677 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/openstack-galera-1" Oct 07 13:38:20 crc kubenswrapper[4677]: I1007 13:38:20.425475 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"8ec4d41c-e5c7-49c2-be5b-2b228d1ddcb8\" (UID: \"8ec4d41c-e5c7-49c2-be5b-2b228d1ddcb8\") " Oct 07 13:38:20 crc kubenswrapper[4677]: I1007 13:38:20.425545 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/8ec4d41c-e5c7-49c2-be5b-2b228d1ddcb8-config-data-default\") pod \"8ec4d41c-e5c7-49c2-be5b-2b228d1ddcb8\" (UID: \"8ec4d41c-e5c7-49c2-be5b-2b228d1ddcb8\") " Oct 07 13:38:20 crc kubenswrapper[4677]: I1007 13:38:20.425592 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/8ec4d41c-e5c7-49c2-be5b-2b228d1ddcb8-kolla-config\") pod \"8ec4d41c-e5c7-49c2-be5b-2b228d1ddcb8\" (UID: \"8ec4d41c-e5c7-49c2-be5b-2b228d1ddcb8\") " Oct 07 13:38:20 crc kubenswrapper[4677]: I1007 13:38:20.425639 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/8ec4d41c-e5c7-49c2-be5b-2b228d1ddcb8-config-data-generated\") pod \"8ec4d41c-e5c7-49c2-be5b-2b228d1ddcb8\" (UID: \"8ec4d41c-e5c7-49c2-be5b-2b228d1ddcb8\") " Oct 07 13:38:20 crc kubenswrapper[4677]: I1007 13:38:20.425672 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lsqp4\" (UniqueName: \"kubernetes.io/projected/8ec4d41c-e5c7-49c2-be5b-2b228d1ddcb8-kube-api-access-lsqp4\") pod \"8ec4d41c-e5c7-49c2-be5b-2b228d1ddcb8\" (UID: \"8ec4d41c-e5c7-49c2-be5b-2b228d1ddcb8\") " Oct 07 13:38:20 crc kubenswrapper[4677]: I1007 13:38:20.425738 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/8ec4d41c-e5c7-49c2-be5b-2b228d1ddcb8-secrets\") pod \"8ec4d41c-e5c7-49c2-be5b-2b228d1ddcb8\" (UID: \"8ec4d41c-e5c7-49c2-be5b-2b228d1ddcb8\") " Oct 07 13:38:20 crc kubenswrapper[4677]: I1007 13:38:20.425838 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8ec4d41c-e5c7-49c2-be5b-2b228d1ddcb8-operator-scripts\") pod \"8ec4d41c-e5c7-49c2-be5b-2b228d1ddcb8\" (UID: \"8ec4d41c-e5c7-49c2-be5b-2b228d1ddcb8\") " Oct 07 13:38:20 crc kubenswrapper[4677]: I1007 13:38:20.426151 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8ec4d41c-e5c7-49c2-be5b-2b228d1ddcb8-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "8ec4d41c-e5c7-49c2-be5b-2b228d1ddcb8" (UID: "8ec4d41c-e5c7-49c2-be5b-2b228d1ddcb8"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:38:20 crc kubenswrapper[4677]: I1007 13:38:20.426751 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8ec4d41c-e5c7-49c2-be5b-2b228d1ddcb8-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "8ec4d41c-e5c7-49c2-be5b-2b228d1ddcb8" (UID: "8ec4d41c-e5c7-49c2-be5b-2b228d1ddcb8"). InnerVolumeSpecName "config-data-default". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:38:20 crc kubenswrapper[4677]: I1007 13:38:20.426742 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8ec4d41c-e5c7-49c2-be5b-2b228d1ddcb8-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "8ec4d41c-e5c7-49c2-be5b-2b228d1ddcb8" (UID: "8ec4d41c-e5c7-49c2-be5b-2b228d1ddcb8"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:38:20 crc kubenswrapper[4677]: I1007 13:38:20.427470 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8ec4d41c-e5c7-49c2-be5b-2b228d1ddcb8-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8ec4d41c-e5c7-49c2-be5b-2b228d1ddcb8" (UID: "8ec4d41c-e5c7-49c2-be5b-2b228d1ddcb8"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:38:20 crc kubenswrapper[4677]: I1007 13:38:20.431207 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ec4d41c-e5c7-49c2-be5b-2b228d1ddcb8-kube-api-access-lsqp4" (OuterVolumeSpecName: "kube-api-access-lsqp4") pod "8ec4d41c-e5c7-49c2-be5b-2b228d1ddcb8" (UID: "8ec4d41c-e5c7-49c2-be5b-2b228d1ddcb8"). InnerVolumeSpecName "kube-api-access-lsqp4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:38:20 crc kubenswrapper[4677]: I1007 13:38:20.437139 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "mysql-db") pod "8ec4d41c-e5c7-49c2-be5b-2b228d1ddcb8" (UID: "8ec4d41c-e5c7-49c2-be5b-2b228d1ddcb8"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 07 13:38:20 crc kubenswrapper[4677]: I1007 13:38:20.441583 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ec4d41c-e5c7-49c2-be5b-2b228d1ddcb8-secrets" (OuterVolumeSpecName: "secrets") pod "8ec4d41c-e5c7-49c2-be5b-2b228d1ddcb8" (UID: "8ec4d41c-e5c7-49c2-be5b-2b228d1ddcb8"). InnerVolumeSpecName "secrets". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:38:20 crc kubenswrapper[4677]: I1007 13:38:20.478196 4677 kuberuntime_container.go:808] "Killing container with a grace period" pod="keystone-kuttl-tests/openstack-galera-0" podUID="bf5b66ae-3d1d-4299-bfe7-d3f3eb705177" containerName="galera" containerID="cri-o://6c1488e323d006e4b4e77fd48cf7b100005e7d12979a7581d721031c896a6bd9" gracePeriod=26 Oct 07 13:38:20 crc kubenswrapper[4677]: I1007 13:38:20.527712 4677 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Oct 07 13:38:20 crc kubenswrapper[4677]: I1007 13:38:20.527771 4677 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/8ec4d41c-e5c7-49c2-be5b-2b228d1ddcb8-config-data-default\") on node \"crc\" DevicePath \"\"" Oct 07 13:38:20 crc kubenswrapper[4677]: I1007 13:38:20.527792 4677 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/8ec4d41c-e5c7-49c2-be5b-2b228d1ddcb8-kolla-config\") on node \"crc\" DevicePath \"\"" Oct 07 13:38:20 crc kubenswrapper[4677]: I1007 13:38:20.527813 4677 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/8ec4d41c-e5c7-49c2-be5b-2b228d1ddcb8-config-data-generated\") on node \"crc\" DevicePath \"\"" Oct 07 13:38:20 crc kubenswrapper[4677]: I1007 13:38:20.527831 4677 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lsqp4\" (UniqueName: \"kubernetes.io/projected/8ec4d41c-e5c7-49c2-be5b-2b228d1ddcb8-kube-api-access-lsqp4\") on node \"crc\" DevicePath \"\"" Oct 07 13:38:20 crc kubenswrapper[4677]: I1007 13:38:20.527846 4677 reconciler_common.go:293] "Volume detached for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/8ec4d41c-e5c7-49c2-be5b-2b228d1ddcb8-secrets\") on node \"crc\" DevicePath \"\"" Oct 07 13:38:20 crc kubenswrapper[4677]: I1007 13:38:20.527863 4677 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8ec4d41c-e5c7-49c2-be5b-2b228d1ddcb8-operator-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 13:38:20 crc kubenswrapper[4677]: I1007 13:38:20.537602 4677 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Oct 07 13:38:20 crc kubenswrapper[4677]: I1007 13:38:20.629655 4677 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Oct 07 13:38:20 crc kubenswrapper[4677]: I1007 13:38:20.900793 4677 generic.go:334] "Generic (PLEG): container finished" podID="8ec4d41c-e5c7-49c2-be5b-2b228d1ddcb8" containerID="86b85efb94bf982e8572ff8e51fc224f0050532519784aa519a0408928d5d6de" exitCode=0 Oct 07 13:38:20 crc kubenswrapper[4677]: I1007 13:38:20.900878 4677 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/openstack-galera-1" Oct 07 13:38:20 crc kubenswrapper[4677]: I1007 13:38:20.900914 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/openstack-galera-1" event={"ID":"8ec4d41c-e5c7-49c2-be5b-2b228d1ddcb8","Type":"ContainerDied","Data":"86b85efb94bf982e8572ff8e51fc224f0050532519784aa519a0408928d5d6de"} Oct 07 13:38:20 crc kubenswrapper[4677]: I1007 13:38:20.901319 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/openstack-galera-1" event={"ID":"8ec4d41c-e5c7-49c2-be5b-2b228d1ddcb8","Type":"ContainerDied","Data":"eb154f906d6745cfc3c33141794e849a19ab7ae624714e5e93fe2191a9dc5e0e"} Oct 07 13:38:20 crc kubenswrapper[4677]: I1007 13:38:20.901354 4677 scope.go:117] "RemoveContainer" containerID="86b85efb94bf982e8572ff8e51fc224f0050532519784aa519a0408928d5d6de" Oct 07 13:38:20 crc kubenswrapper[4677]: I1007 13:38:20.921701 4677 scope.go:117] "RemoveContainer" containerID="9d3359c7c136a9e1ea3a4a41290e3897c160364c2dc05f4779b52a6e74a12246" Oct 07 13:38:20 crc kubenswrapper[4677]: I1007 13:38:20.950293 4677 scope.go:117] "RemoveContainer" containerID="86b85efb94bf982e8572ff8e51fc224f0050532519784aa519a0408928d5d6de" Oct 07 13:38:20 crc kubenswrapper[4677]: I1007 13:38:20.950769 4677 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/openstack-galera-1"] Oct 07 13:38:20 crc kubenswrapper[4677]: E1007 13:38:20.950994 4677 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"86b85efb94bf982e8572ff8e51fc224f0050532519784aa519a0408928d5d6de\": container with ID starting with 86b85efb94bf982e8572ff8e51fc224f0050532519784aa519a0408928d5d6de not found: ID does not exist" containerID="86b85efb94bf982e8572ff8e51fc224f0050532519784aa519a0408928d5d6de" Oct 07 13:38:20 crc kubenswrapper[4677]: I1007 13:38:20.951048 4677 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"86b85efb94bf982e8572ff8e51fc224f0050532519784aa519a0408928d5d6de"} err="failed to get container status \"86b85efb94bf982e8572ff8e51fc224f0050532519784aa519a0408928d5d6de\": rpc error: code = NotFound desc = could not find container \"86b85efb94bf982e8572ff8e51fc224f0050532519784aa519a0408928d5d6de\": container with ID starting with 86b85efb94bf982e8572ff8e51fc224f0050532519784aa519a0408928d5d6de not found: ID does not exist" Oct 07 13:38:20 crc kubenswrapper[4677]: I1007 13:38:20.951082 4677 scope.go:117] "RemoveContainer" containerID="9d3359c7c136a9e1ea3a4a41290e3897c160364c2dc05f4779b52a6e74a12246" Oct 07 13:38:20 crc kubenswrapper[4677]: E1007 13:38:20.951366 4677 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9d3359c7c136a9e1ea3a4a41290e3897c160364c2dc05f4779b52a6e74a12246\": container with ID starting with 9d3359c7c136a9e1ea3a4a41290e3897c160364c2dc05f4779b52a6e74a12246 not found: ID does not exist" containerID="9d3359c7c136a9e1ea3a4a41290e3897c160364c2dc05f4779b52a6e74a12246" Oct 07 13:38:20 crc kubenswrapper[4677]: I1007 13:38:20.951536 4677 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d3359c7c136a9e1ea3a4a41290e3897c160364c2dc05f4779b52a6e74a12246"} err="failed to get container status \"9d3359c7c136a9e1ea3a4a41290e3897c160364c2dc05f4779b52a6e74a12246\": rpc error: code = NotFound desc = could not find container 
\"9d3359c7c136a9e1ea3a4a41290e3897c160364c2dc05f4779b52a6e74a12246\": container with ID starting with 9d3359c7c136a9e1ea3a4a41290e3897c160364c2dc05f4779b52a6e74a12246 not found: ID does not exist" Oct 07 13:38:20 crc kubenswrapper[4677]: I1007 13:38:20.953631 4677 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/openstack-galera-1"] Oct 07 13:38:21 crc kubenswrapper[4677]: I1007 13:38:21.311230 4677 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5f1035cf-bd50-4425-a342-92a71ab7f16e" path="/var/lib/kubelet/pods/5f1035cf-bd50-4425-a342-92a71ab7f16e/volumes" Oct 07 13:38:21 crc kubenswrapper[4677]: I1007 13:38:21.312212 4677 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="869e5c88-2679-4b90-b674-bba233dc88e0" path="/var/lib/kubelet/pods/869e5c88-2679-4b90-b674-bba233dc88e0/volumes" Oct 07 13:38:21 crc kubenswrapper[4677]: I1007 13:38:21.313611 4677 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8ec4d41c-e5c7-49c2-be5b-2b228d1ddcb8" path="/var/lib/kubelet/pods/8ec4d41c-e5c7-49c2-be5b-2b228d1ddcb8/volumes" Oct 07 13:38:21 crc kubenswrapper[4677]: I1007 13:38:21.314975 4677 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c50e7113-f37a-4ea1-9d53-c53106564a48" path="/var/lib/kubelet/pods/c50e7113-f37a-4ea1-9d53-c53106564a48/volumes" Oct 07 13:38:21 crc kubenswrapper[4677]: I1007 13:38:21.405504 4677 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/openstack-galera-0" Oct 07 13:38:21 crc kubenswrapper[4677]: I1007 13:38:21.534743 4677 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/infra-operator-controller-manager-586b5ff777-4p7n4"] Oct 07 13:38:21 crc kubenswrapper[4677]: I1007 13:38:21.534944 4677 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/infra-operator-controller-manager-586b5ff777-4p7n4" podUID="97f6efbc-5f7a-48b6-ae09-35afb8f8dd7e" containerName="manager" containerID="cri-o://9701bf2bb18cb5a0dd935c25551e909002c36fdfad502d82e9612b582647ccb1" gracePeriod=10 Oct 07 13:38:21 crc kubenswrapper[4677]: I1007 13:38:21.535022 4677 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/infra-operator-controller-manager-586b5ff777-4p7n4" podUID="97f6efbc-5f7a-48b6-ae09-35afb8f8dd7e" containerName="kube-rbac-proxy" containerID="cri-o://110b2f63d547d3dd36d4010a8aa1c8a98f9faa65bddee7244f9e2d754401591a" gracePeriod=10 Oct 07 13:38:21 crc kubenswrapper[4677]: I1007 13:38:21.539645 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/bf5b66ae-3d1d-4299-bfe7-d3f3eb705177-config-data-default\") pod \"bf5b66ae-3d1d-4299-bfe7-d3f3eb705177\" (UID: \"bf5b66ae-3d1d-4299-bfe7-d3f3eb705177\") " Oct 07 13:38:21 crc kubenswrapper[4677]: I1007 13:38:21.539689 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/bf5b66ae-3d1d-4299-bfe7-d3f3eb705177-config-data-generated\") pod \"bf5b66ae-3d1d-4299-bfe7-d3f3eb705177\" (UID: \"bf5b66ae-3d1d-4299-bfe7-d3f3eb705177\") " Oct 07 13:38:21 crc kubenswrapper[4677]: I1007 13:38:21.539757 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/bf5b66ae-3d1d-4299-bfe7-d3f3eb705177-kolla-config\") pod 
\"bf5b66ae-3d1d-4299-bfe7-d3f3eb705177\" (UID: \"bf5b66ae-3d1d-4299-bfe7-d3f3eb705177\") " Oct 07 13:38:21 crc kubenswrapper[4677]: I1007 13:38:21.539794 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zbvqj\" (UniqueName: \"kubernetes.io/projected/bf5b66ae-3d1d-4299-bfe7-d3f3eb705177-kube-api-access-zbvqj\") pod \"bf5b66ae-3d1d-4299-bfe7-d3f3eb705177\" (UID: \"bf5b66ae-3d1d-4299-bfe7-d3f3eb705177\") " Oct 07 13:38:21 crc kubenswrapper[4677]: I1007 13:38:21.539811 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"bf5b66ae-3d1d-4299-bfe7-d3f3eb705177\" (UID: \"bf5b66ae-3d1d-4299-bfe7-d3f3eb705177\") " Oct 07 13:38:21 crc kubenswrapper[4677]: I1007 13:38:21.539840 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bf5b66ae-3d1d-4299-bfe7-d3f3eb705177-operator-scripts\") pod \"bf5b66ae-3d1d-4299-bfe7-d3f3eb705177\" (UID: \"bf5b66ae-3d1d-4299-bfe7-d3f3eb705177\") " Oct 07 13:38:21 crc kubenswrapper[4677]: I1007 13:38:21.539863 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/bf5b66ae-3d1d-4299-bfe7-d3f3eb705177-secrets\") pod \"bf5b66ae-3d1d-4299-bfe7-d3f3eb705177\" (UID: \"bf5b66ae-3d1d-4299-bfe7-d3f3eb705177\") " Oct 07 13:38:21 crc kubenswrapper[4677]: I1007 13:38:21.540339 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf5b66ae-3d1d-4299-bfe7-d3f3eb705177-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "bf5b66ae-3d1d-4299-bfe7-d3f3eb705177" (UID: "bf5b66ae-3d1d-4299-bfe7-d3f3eb705177"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:38:21 crc kubenswrapper[4677]: I1007 13:38:21.540361 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf5b66ae-3d1d-4299-bfe7-d3f3eb705177-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "bf5b66ae-3d1d-4299-bfe7-d3f3eb705177" (UID: "bf5b66ae-3d1d-4299-bfe7-d3f3eb705177"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:38:21 crc kubenswrapper[4677]: I1007 13:38:21.540576 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bf5b66ae-3d1d-4299-bfe7-d3f3eb705177-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "bf5b66ae-3d1d-4299-bfe7-d3f3eb705177" (UID: "bf5b66ae-3d1d-4299-bfe7-d3f3eb705177"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:38:21 crc kubenswrapper[4677]: I1007 13:38:21.540993 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf5b66ae-3d1d-4299-bfe7-d3f3eb705177-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "bf5b66ae-3d1d-4299-bfe7-d3f3eb705177" (UID: "bf5b66ae-3d1d-4299-bfe7-d3f3eb705177"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:38:21 crc kubenswrapper[4677]: I1007 13:38:21.546398 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf5b66ae-3d1d-4299-bfe7-d3f3eb705177-secrets" (OuterVolumeSpecName: "secrets") pod "bf5b66ae-3d1d-4299-bfe7-d3f3eb705177" (UID: "bf5b66ae-3d1d-4299-bfe7-d3f3eb705177"). InnerVolumeSpecName "secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:38:21 crc kubenswrapper[4677]: I1007 13:38:21.551600 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf5b66ae-3d1d-4299-bfe7-d3f3eb705177-kube-api-access-zbvqj" (OuterVolumeSpecName: "kube-api-access-zbvqj") pod "bf5b66ae-3d1d-4299-bfe7-d3f3eb705177" (UID: "bf5b66ae-3d1d-4299-bfe7-d3f3eb705177"). InnerVolumeSpecName "kube-api-access-zbvqj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:38:21 crc kubenswrapper[4677]: I1007 13:38:21.565160 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "mysql-db") pod "bf5b66ae-3d1d-4299-bfe7-d3f3eb705177" (UID: "bf5b66ae-3d1d-4299-bfe7-d3f3eb705177"). InnerVolumeSpecName "local-storage04-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 07 13:38:21 crc kubenswrapper[4677]: I1007 13:38:21.641656 4677 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Oct 07 13:38:21 crc kubenswrapper[4677]: I1007 13:38:21.641922 4677 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bf5b66ae-3d1d-4299-bfe7-d3f3eb705177-operator-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 13:38:21 crc kubenswrapper[4677]: I1007 13:38:21.641933 4677 reconciler_common.go:293] "Volume detached for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/bf5b66ae-3d1d-4299-bfe7-d3f3eb705177-secrets\") on node \"crc\" DevicePath \"\"" Oct 07 13:38:21 crc kubenswrapper[4677]: I1007 13:38:21.641943 4677 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/bf5b66ae-3d1d-4299-bfe7-d3f3eb705177-config-data-default\") on node \"crc\" DevicePath \"\"" Oct 07 13:38:21 crc kubenswrapper[4677]: I1007 13:38:21.641952 4677 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/bf5b66ae-3d1d-4299-bfe7-d3f3eb705177-config-data-generated\") on node \"crc\" DevicePath \"\"" Oct 07 13:38:21 crc kubenswrapper[4677]: I1007 13:38:21.641960 4677 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/bf5b66ae-3d1d-4299-bfe7-d3f3eb705177-kolla-config\") on node \"crc\" DevicePath \"\"" Oct 07 13:38:21 crc kubenswrapper[4677]: I1007 13:38:21.641968 4677 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zbvqj\" (UniqueName: \"kubernetes.io/projected/bf5b66ae-3d1d-4299-bfe7-d3f3eb705177-kube-api-access-zbvqj\") on node \"crc\" DevicePath \"\"" Oct 07 13:38:21 crc kubenswrapper[4677]: I1007 13:38:21.659073 4677 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Oct 07 13:38:21 crc kubenswrapper[4677]: I1007 13:38:21.743336 4677 reconciler_common.go:293] 
"Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Oct 07 13:38:21 crc kubenswrapper[4677]: I1007 13:38:21.801720 4677 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/infra-operator-index-mjjn9"] Oct 07 13:38:21 crc kubenswrapper[4677]: I1007 13:38:21.801951 4677 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/infra-operator-index-mjjn9" podUID="3d440c3d-e4d7-4b91-87e7-0f73d587d638" containerName="registry-server" containerID="cri-o://108450553efb00f057badb300fdcfc77a8ab2ccf958931a17a1377bd0c6c4703" gracePeriod=30 Oct 07 13:38:21 crc kubenswrapper[4677]: I1007 13:38:21.814006 4677 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/4cb402f945d54a80688ab4565a4e9d19d5d4eb730a5ce0fdf7f49eb313zx98r"] Oct 07 13:38:21 crc kubenswrapper[4677]: I1007 13:38:21.818054 4677 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/4cb402f945d54a80688ab4565a4e9d19d5d4eb730a5ce0fdf7f49eb313zx98r"] Oct 07 13:38:21 crc kubenswrapper[4677]: I1007 13:38:21.883387 4677 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-586b5ff777-4p7n4" Oct 07 13:38:21 crc kubenswrapper[4677]: I1007 13:38:21.909538 4677 generic.go:334] "Generic (PLEG): container finished" podID="bf5b66ae-3d1d-4299-bfe7-d3f3eb705177" containerID="6c1488e323d006e4b4e77fd48cf7b100005e7d12979a7581d721031c896a6bd9" exitCode=0 Oct 07 13:38:21 crc kubenswrapper[4677]: I1007 13:38:21.909592 4677 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/openstack-galera-0" Oct 07 13:38:21 crc kubenswrapper[4677]: I1007 13:38:21.909612 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/openstack-galera-0" event={"ID":"bf5b66ae-3d1d-4299-bfe7-d3f3eb705177","Type":"ContainerDied","Data":"6c1488e323d006e4b4e77fd48cf7b100005e7d12979a7581d721031c896a6bd9"} Oct 07 13:38:21 crc kubenswrapper[4677]: I1007 13:38:21.909649 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/openstack-galera-0" event={"ID":"bf5b66ae-3d1d-4299-bfe7-d3f3eb705177","Type":"ContainerDied","Data":"b5bb87dc83ab68feab18d3b344507b1f98c1608cb2e1c961e06ad1a1a3d42bae"} Oct 07 13:38:21 crc kubenswrapper[4677]: I1007 13:38:21.909671 4677 scope.go:117] "RemoveContainer" containerID="6c1488e323d006e4b4e77fd48cf7b100005e7d12979a7581d721031c896a6bd9" Oct 07 13:38:21 crc kubenswrapper[4677]: I1007 13:38:21.913016 4677 generic.go:334] "Generic (PLEG): container finished" podID="97f6efbc-5f7a-48b6-ae09-35afb8f8dd7e" containerID="110b2f63d547d3dd36d4010a8aa1c8a98f9faa65bddee7244f9e2d754401591a" exitCode=0 Oct 07 13:38:21 crc kubenswrapper[4677]: I1007 13:38:21.913038 4677 generic.go:334] "Generic (PLEG): container finished" podID="97f6efbc-5f7a-48b6-ae09-35afb8f8dd7e" containerID="9701bf2bb18cb5a0dd935c25551e909002c36fdfad502d82e9612b582647ccb1" exitCode=0 Oct 07 13:38:21 crc kubenswrapper[4677]: I1007 13:38:21.913076 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-586b5ff777-4p7n4" event={"ID":"97f6efbc-5f7a-48b6-ae09-35afb8f8dd7e","Type":"ContainerDied","Data":"110b2f63d547d3dd36d4010a8aa1c8a98f9faa65bddee7244f9e2d754401591a"} Oct 07 13:38:21 crc kubenswrapper[4677]: I1007 13:38:21.913101 4677 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack-operators/infra-operator-controller-manager-586b5ff777-4p7n4" event={"ID":"97f6efbc-5f7a-48b6-ae09-35afb8f8dd7e","Type":"ContainerDied","Data":"9701bf2bb18cb5a0dd935c25551e909002c36fdfad502d82e9612b582647ccb1"} Oct 07 13:38:21 crc kubenswrapper[4677]: I1007 13:38:21.913110 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-586b5ff777-4p7n4" event={"ID":"97f6efbc-5f7a-48b6-ae09-35afb8f8dd7e","Type":"ContainerDied","Data":"3ef030ba7124d69940a36cebcd43162e1cc6736ed714d1c9e2311a8803556b32"} Oct 07 13:38:21 crc kubenswrapper[4677]: I1007 13:38:21.913158 4677 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-586b5ff777-4p7n4" Oct 07 13:38:21 crc kubenswrapper[4677]: I1007 13:38:21.929007 4677 scope.go:117] "RemoveContainer" containerID="fb181f0f4644fb4c7ac9cfd18869ed5b8489d38d377c7d7ab7d3d9535c1db160" Oct 07 13:38:21 crc kubenswrapper[4677]: I1007 13:38:21.941913 4677 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/openstack-galera-0"] Oct 07 13:38:21 crc kubenswrapper[4677]: I1007 13:38:21.945042 4677 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/openstack-galera-0"] Oct 07 13:38:21 crc kubenswrapper[4677]: I1007 13:38:21.950782 4677 scope.go:117] "RemoveContainer" containerID="6c1488e323d006e4b4e77fd48cf7b100005e7d12979a7581d721031c896a6bd9" Oct 07 13:38:21 crc kubenswrapper[4677]: E1007 13:38:21.951152 4677 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6c1488e323d006e4b4e77fd48cf7b100005e7d12979a7581d721031c896a6bd9\": container with ID starting with 6c1488e323d006e4b4e77fd48cf7b100005e7d12979a7581d721031c896a6bd9 not found: ID does not exist" containerID="6c1488e323d006e4b4e77fd48cf7b100005e7d12979a7581d721031c896a6bd9" Oct 07 13:38:21 crc kubenswrapper[4677]: I1007 13:38:21.951191 4677 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c1488e323d006e4b4e77fd48cf7b100005e7d12979a7581d721031c896a6bd9"} err="failed to get container status \"6c1488e323d006e4b4e77fd48cf7b100005e7d12979a7581d721031c896a6bd9\": rpc error: code = NotFound desc = could not find container \"6c1488e323d006e4b4e77fd48cf7b100005e7d12979a7581d721031c896a6bd9\": container with ID starting with 6c1488e323d006e4b4e77fd48cf7b100005e7d12979a7581d721031c896a6bd9 not found: ID does not exist" Oct 07 13:38:21 crc kubenswrapper[4677]: I1007 13:38:21.951219 4677 scope.go:117] "RemoveContainer" containerID="fb181f0f4644fb4c7ac9cfd18869ed5b8489d38d377c7d7ab7d3d9535c1db160" Oct 07 13:38:21 crc kubenswrapper[4677]: E1007 13:38:21.951529 4677 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fb181f0f4644fb4c7ac9cfd18869ed5b8489d38d377c7d7ab7d3d9535c1db160\": container with ID starting with fb181f0f4644fb4c7ac9cfd18869ed5b8489d38d377c7d7ab7d3d9535c1db160 not found: ID does not exist" containerID="fb181f0f4644fb4c7ac9cfd18869ed5b8489d38d377c7d7ab7d3d9535c1db160" Oct 07 13:38:21 crc kubenswrapper[4677]: I1007 13:38:21.951553 4677 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb181f0f4644fb4c7ac9cfd18869ed5b8489d38d377c7d7ab7d3d9535c1db160"} err="failed to get container status \"fb181f0f4644fb4c7ac9cfd18869ed5b8489d38d377c7d7ab7d3d9535c1db160\": rpc error: code = NotFound 
desc = could not find container \"fb181f0f4644fb4c7ac9cfd18869ed5b8489d38d377c7d7ab7d3d9535c1db160\": container with ID starting with fb181f0f4644fb4c7ac9cfd18869ed5b8489d38d377c7d7ab7d3d9535c1db160 not found: ID does not exist" Oct 07 13:38:21 crc kubenswrapper[4677]: I1007 13:38:21.951571 4677 scope.go:117] "RemoveContainer" containerID="110b2f63d547d3dd36d4010a8aa1c8a98f9faa65bddee7244f9e2d754401591a" Oct 07 13:38:21 crc kubenswrapper[4677]: I1007 13:38:21.969531 4677 scope.go:117] "RemoveContainer" containerID="9701bf2bb18cb5a0dd935c25551e909002c36fdfad502d82e9612b582647ccb1" Oct 07 13:38:21 crc kubenswrapper[4677]: I1007 13:38:21.984679 4677 scope.go:117] "RemoveContainer" containerID="110b2f63d547d3dd36d4010a8aa1c8a98f9faa65bddee7244f9e2d754401591a" Oct 07 13:38:21 crc kubenswrapper[4677]: E1007 13:38:21.985198 4677 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"110b2f63d547d3dd36d4010a8aa1c8a98f9faa65bddee7244f9e2d754401591a\": container with ID starting with 110b2f63d547d3dd36d4010a8aa1c8a98f9faa65bddee7244f9e2d754401591a not found: ID does not exist" containerID="110b2f63d547d3dd36d4010a8aa1c8a98f9faa65bddee7244f9e2d754401591a" Oct 07 13:38:21 crc kubenswrapper[4677]: I1007 13:38:21.985239 4677 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"110b2f63d547d3dd36d4010a8aa1c8a98f9faa65bddee7244f9e2d754401591a"} err="failed to get container status \"110b2f63d547d3dd36d4010a8aa1c8a98f9faa65bddee7244f9e2d754401591a\": rpc error: code = NotFound desc = could not find container \"110b2f63d547d3dd36d4010a8aa1c8a98f9faa65bddee7244f9e2d754401591a\": container with ID starting with 110b2f63d547d3dd36d4010a8aa1c8a98f9faa65bddee7244f9e2d754401591a not found: ID does not exist" Oct 07 13:38:21 crc kubenswrapper[4677]: I1007 13:38:21.985265 4677 scope.go:117] "RemoveContainer" containerID="9701bf2bb18cb5a0dd935c25551e909002c36fdfad502d82e9612b582647ccb1" Oct 07 13:38:21 crc kubenswrapper[4677]: E1007 13:38:21.985562 4677 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9701bf2bb18cb5a0dd935c25551e909002c36fdfad502d82e9612b582647ccb1\": container with ID starting with 9701bf2bb18cb5a0dd935c25551e909002c36fdfad502d82e9612b582647ccb1 not found: ID does not exist" containerID="9701bf2bb18cb5a0dd935c25551e909002c36fdfad502d82e9612b582647ccb1" Oct 07 13:38:21 crc kubenswrapper[4677]: I1007 13:38:21.985591 4677 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9701bf2bb18cb5a0dd935c25551e909002c36fdfad502d82e9612b582647ccb1"} err="failed to get container status \"9701bf2bb18cb5a0dd935c25551e909002c36fdfad502d82e9612b582647ccb1\": rpc error: code = NotFound desc = could not find container \"9701bf2bb18cb5a0dd935c25551e909002c36fdfad502d82e9612b582647ccb1\": container with ID starting with 9701bf2bb18cb5a0dd935c25551e909002c36fdfad502d82e9612b582647ccb1 not found: ID does not exist" Oct 07 13:38:21 crc kubenswrapper[4677]: I1007 13:38:21.985608 4677 scope.go:117] "RemoveContainer" containerID="110b2f63d547d3dd36d4010a8aa1c8a98f9faa65bddee7244f9e2d754401591a" Oct 07 13:38:21 crc kubenswrapper[4677]: I1007 13:38:21.985906 4677 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"110b2f63d547d3dd36d4010a8aa1c8a98f9faa65bddee7244f9e2d754401591a"} err="failed to get container status 
\"110b2f63d547d3dd36d4010a8aa1c8a98f9faa65bddee7244f9e2d754401591a\": rpc error: code = NotFound desc = could not find container \"110b2f63d547d3dd36d4010a8aa1c8a98f9faa65bddee7244f9e2d754401591a\": container with ID starting with 110b2f63d547d3dd36d4010a8aa1c8a98f9faa65bddee7244f9e2d754401591a not found: ID does not exist" Oct 07 13:38:21 crc kubenswrapper[4677]: I1007 13:38:21.985929 4677 scope.go:117] "RemoveContainer" containerID="9701bf2bb18cb5a0dd935c25551e909002c36fdfad502d82e9612b582647ccb1" Oct 07 13:38:21 crc kubenswrapper[4677]: I1007 13:38:21.986234 4677 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9701bf2bb18cb5a0dd935c25551e909002c36fdfad502d82e9612b582647ccb1"} err="failed to get container status \"9701bf2bb18cb5a0dd935c25551e909002c36fdfad502d82e9612b582647ccb1\": rpc error: code = NotFound desc = could not find container \"9701bf2bb18cb5a0dd935c25551e909002c36fdfad502d82e9612b582647ccb1\": container with ID starting with 9701bf2bb18cb5a0dd935c25551e909002c36fdfad502d82e9612b582647ccb1 not found: ID does not exist" Oct 07 13:38:22 crc kubenswrapper[4677]: I1007 13:38:22.047131 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f4kxd\" (UniqueName: \"kubernetes.io/projected/97f6efbc-5f7a-48b6-ae09-35afb8f8dd7e-kube-api-access-f4kxd\") pod \"97f6efbc-5f7a-48b6-ae09-35afb8f8dd7e\" (UID: \"97f6efbc-5f7a-48b6-ae09-35afb8f8dd7e\") " Oct 07 13:38:22 crc kubenswrapper[4677]: I1007 13:38:22.047194 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/97f6efbc-5f7a-48b6-ae09-35afb8f8dd7e-apiservice-cert\") pod \"97f6efbc-5f7a-48b6-ae09-35afb8f8dd7e\" (UID: \"97f6efbc-5f7a-48b6-ae09-35afb8f8dd7e\") " Oct 07 13:38:22 crc kubenswrapper[4677]: I1007 13:38:22.047217 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/97f6efbc-5f7a-48b6-ae09-35afb8f8dd7e-webhook-cert\") pod \"97f6efbc-5f7a-48b6-ae09-35afb8f8dd7e\" (UID: \"97f6efbc-5f7a-48b6-ae09-35afb8f8dd7e\") " Oct 07 13:38:22 crc kubenswrapper[4677]: I1007 13:38:22.050753 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97f6efbc-5f7a-48b6-ae09-35afb8f8dd7e-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "97f6efbc-5f7a-48b6-ae09-35afb8f8dd7e" (UID: "97f6efbc-5f7a-48b6-ae09-35afb8f8dd7e"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:38:22 crc kubenswrapper[4677]: I1007 13:38:22.051015 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97f6efbc-5f7a-48b6-ae09-35afb8f8dd7e-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "97f6efbc-5f7a-48b6-ae09-35afb8f8dd7e" (UID: "97f6efbc-5f7a-48b6-ae09-35afb8f8dd7e"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:38:22 crc kubenswrapper[4677]: I1007 13:38:22.056376 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97f6efbc-5f7a-48b6-ae09-35afb8f8dd7e-kube-api-access-f4kxd" (OuterVolumeSpecName: "kube-api-access-f4kxd") pod "97f6efbc-5f7a-48b6-ae09-35afb8f8dd7e" (UID: "97f6efbc-5f7a-48b6-ae09-35afb8f8dd7e"). InnerVolumeSpecName "kube-api-access-f4kxd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:38:22 crc kubenswrapper[4677]: I1007 13:38:22.149260 4677 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f4kxd\" (UniqueName: \"kubernetes.io/projected/97f6efbc-5f7a-48b6-ae09-35afb8f8dd7e-kube-api-access-f4kxd\") on node \"crc\" DevicePath \"\"" Oct 07 13:38:22 crc kubenswrapper[4677]: I1007 13:38:22.149307 4677 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/97f6efbc-5f7a-48b6-ae09-35afb8f8dd7e-webhook-cert\") on node \"crc\" DevicePath \"\"" Oct 07 13:38:22 crc kubenswrapper[4677]: I1007 13:38:22.149317 4677 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/97f6efbc-5f7a-48b6-ae09-35afb8f8dd7e-apiservice-cert\") on node \"crc\" DevicePath \"\"" Oct 07 13:38:22 crc kubenswrapper[4677]: I1007 13:38:22.177717 4677 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-index-mjjn9" Oct 07 13:38:22 crc kubenswrapper[4677]: I1007 13:38:22.247354 4677 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/infra-operator-controller-manager-586b5ff777-4p7n4"] Oct 07 13:38:22 crc kubenswrapper[4677]: I1007 13:38:22.255699 4677 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/infra-operator-controller-manager-586b5ff777-4p7n4"] Oct 07 13:38:22 crc kubenswrapper[4677]: I1007 13:38:22.351480 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gss66\" (UniqueName: \"kubernetes.io/projected/3d440c3d-e4d7-4b91-87e7-0f73d587d638-kube-api-access-gss66\") pod \"3d440c3d-e4d7-4b91-87e7-0f73d587d638\" (UID: \"3d440c3d-e4d7-4b91-87e7-0f73d587d638\") " Oct 07 13:38:22 crc kubenswrapper[4677]: I1007 13:38:22.354754 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d440c3d-e4d7-4b91-87e7-0f73d587d638-kube-api-access-gss66" (OuterVolumeSpecName: "kube-api-access-gss66") pod "3d440c3d-e4d7-4b91-87e7-0f73d587d638" (UID: "3d440c3d-e4d7-4b91-87e7-0f73d587d638"). InnerVolumeSpecName "kube-api-access-gss66". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:38:22 crc kubenswrapper[4677]: I1007 13:38:22.454013 4677 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gss66\" (UniqueName: \"kubernetes.io/projected/3d440c3d-e4d7-4b91-87e7-0f73d587d638-kube-api-access-gss66\") on node \"crc\" DevicePath \"\"" Oct 07 13:38:22 crc kubenswrapper[4677]: I1007 13:38:22.929137 4677 generic.go:334] "Generic (PLEG): container finished" podID="3d440c3d-e4d7-4b91-87e7-0f73d587d638" containerID="108450553efb00f057badb300fdcfc77a8ab2ccf958931a17a1377bd0c6c4703" exitCode=0 Oct 07 13:38:22 crc kubenswrapper[4677]: I1007 13:38:22.929202 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-index-mjjn9" event={"ID":"3d440c3d-e4d7-4b91-87e7-0f73d587d638","Type":"ContainerDied","Data":"108450553efb00f057badb300fdcfc77a8ab2ccf958931a17a1377bd0c6c4703"} Oct 07 13:38:22 crc kubenswrapper[4677]: I1007 13:38:22.929232 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-index-mjjn9" event={"ID":"3d440c3d-e4d7-4b91-87e7-0f73d587d638","Type":"ContainerDied","Data":"c068648479cf839709652c29f903e53bc9e5c7ca1e9f657457acb461ff14b994"} Oct 07 13:38:22 crc kubenswrapper[4677]: I1007 13:38:22.929227 4677 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-index-mjjn9" Oct 07 13:38:22 crc kubenswrapper[4677]: I1007 13:38:22.929292 4677 scope.go:117] "RemoveContainer" containerID="108450553efb00f057badb300fdcfc77a8ab2ccf958931a17a1377bd0c6c4703" Oct 07 13:38:22 crc kubenswrapper[4677]: I1007 13:38:22.954684 4677 scope.go:117] "RemoveContainer" containerID="108450553efb00f057badb300fdcfc77a8ab2ccf958931a17a1377bd0c6c4703" Oct 07 13:38:22 crc kubenswrapper[4677]: E1007 13:38:22.955224 4677 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"108450553efb00f057badb300fdcfc77a8ab2ccf958931a17a1377bd0c6c4703\": container with ID starting with 108450553efb00f057badb300fdcfc77a8ab2ccf958931a17a1377bd0c6c4703 not found: ID does not exist" containerID="108450553efb00f057badb300fdcfc77a8ab2ccf958931a17a1377bd0c6c4703" Oct 07 13:38:22 crc kubenswrapper[4677]: I1007 13:38:22.955261 4677 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"108450553efb00f057badb300fdcfc77a8ab2ccf958931a17a1377bd0c6c4703"} err="failed to get container status \"108450553efb00f057badb300fdcfc77a8ab2ccf958931a17a1377bd0c6c4703\": rpc error: code = NotFound desc = could not find container \"108450553efb00f057badb300fdcfc77a8ab2ccf958931a17a1377bd0c6c4703\": container with ID starting with 108450553efb00f057badb300fdcfc77a8ab2ccf958931a17a1377bd0c6c4703 not found: ID does not exist" Oct 07 13:38:22 crc kubenswrapper[4677]: I1007 13:38:22.962062 4677 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/infra-operator-index-mjjn9"] Oct 07 13:38:22 crc kubenswrapper[4677]: I1007 13:38:22.967187 4677 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/infra-operator-index-mjjn9"] Oct 07 13:38:23 crc kubenswrapper[4677]: I1007 13:38:23.138473 4677 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-847ff55875-g5gf7"] Oct 07 13:38:23 crc kubenswrapper[4677]: I1007 13:38:23.138777 4677 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack-operators/mariadb-operator-controller-manager-847ff55875-g5gf7" podUID="f1f08780-3bfe-4470-87d2-05bfaf5d89ce" containerName="manager" containerID="cri-o://544fb2e8b8509a0c4dfac25f799cad7c582e63dfe6cedf6652d38e359ad8ec15" gracePeriod=10 Oct 07 13:38:23 crc kubenswrapper[4677]: I1007 13:38:23.138970 4677 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/mariadb-operator-controller-manager-847ff55875-g5gf7" podUID="f1f08780-3bfe-4470-87d2-05bfaf5d89ce" containerName="kube-rbac-proxy" containerID="cri-o://4eab8717e6659894ada0e2205510ff0edba5e637138f2090e1b917320033e79b" gracePeriod=10 Oct 07 13:38:23 crc kubenswrapper[4677]: I1007 13:38:23.312613 4677 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0182d27a-0a19-4496-a514-75b6ca93d5b7" path="/var/lib/kubelet/pods/0182d27a-0a19-4496-a514-75b6ca93d5b7/volumes" Oct 07 13:38:23 crc kubenswrapper[4677]: I1007 13:38:23.313359 4677 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d440c3d-e4d7-4b91-87e7-0f73d587d638" path="/var/lib/kubelet/pods/3d440c3d-e4d7-4b91-87e7-0f73d587d638/volumes" Oct 07 13:38:23 crc kubenswrapper[4677]: I1007 13:38:23.313830 4677 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="97f6efbc-5f7a-48b6-ae09-35afb8f8dd7e" path="/var/lib/kubelet/pods/97f6efbc-5f7a-48b6-ae09-35afb8f8dd7e/volumes" Oct 07 13:38:23 crc kubenswrapper[4677]: I1007 13:38:23.315074 4677 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf5b66ae-3d1d-4299-bfe7-d3f3eb705177" path="/var/lib/kubelet/pods/bf5b66ae-3d1d-4299-bfe7-d3f3eb705177/volumes" Oct 07 13:38:23 crc kubenswrapper[4677]: I1007 13:38:23.402828 4677 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/mariadb-operator-index-6fln4"] Oct 07 13:38:23 crc kubenswrapper[4677]: I1007 13:38:23.403068 4677 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/mariadb-operator-index-6fln4" podUID="32459eb8-0dbe-4046-9798-85e4cb9aca83" containerName="registry-server" containerID="cri-o://971ec747a4481d1eb22f421ff26f9931e447d8743f8a3f8bb6093e169acba357" gracePeriod=30 Oct 07 13:38:23 crc kubenswrapper[4677]: I1007 13:38:23.452237 4677 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/10f57ec60de4df58ed39de93369cc80174e5ad08476bc9cf01944ad89058jsn"] Oct 07 13:38:23 crc kubenswrapper[4677]: I1007 13:38:23.456198 4677 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/10f57ec60de4df58ed39de93369cc80174e5ad08476bc9cf01944ad89058jsn"] Oct 07 13:38:23 crc kubenswrapper[4677]: I1007 13:38:23.841278 4677 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-index-6fln4" Oct 07 13:38:23 crc kubenswrapper[4677]: I1007 13:38:23.939193 4677 generic.go:334] "Generic (PLEG): container finished" podID="f1f08780-3bfe-4470-87d2-05bfaf5d89ce" containerID="4eab8717e6659894ada0e2205510ff0edba5e637138f2090e1b917320033e79b" exitCode=0 Oct 07 13:38:23 crc kubenswrapper[4677]: I1007 13:38:23.939219 4677 generic.go:334] "Generic (PLEG): container finished" podID="f1f08780-3bfe-4470-87d2-05bfaf5d89ce" containerID="544fb2e8b8509a0c4dfac25f799cad7c582e63dfe6cedf6652d38e359ad8ec15" exitCode=0 Oct 07 13:38:23 crc kubenswrapper[4677]: I1007 13:38:23.939253 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-847ff55875-g5gf7" event={"ID":"f1f08780-3bfe-4470-87d2-05bfaf5d89ce","Type":"ContainerDied","Data":"4eab8717e6659894ada0e2205510ff0edba5e637138f2090e1b917320033e79b"} Oct 07 13:38:23 crc kubenswrapper[4677]: I1007 13:38:23.939281 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-847ff55875-g5gf7" event={"ID":"f1f08780-3bfe-4470-87d2-05bfaf5d89ce","Type":"ContainerDied","Data":"544fb2e8b8509a0c4dfac25f799cad7c582e63dfe6cedf6652d38e359ad8ec15"} Oct 07 13:38:23 crc kubenswrapper[4677]: I1007 13:38:23.940116 4677 generic.go:334] "Generic (PLEG): container finished" podID="32459eb8-0dbe-4046-9798-85e4cb9aca83" containerID="971ec747a4481d1eb22f421ff26f9931e447d8743f8a3f8bb6093e169acba357" exitCode=0 Oct 07 13:38:23 crc kubenswrapper[4677]: I1007 13:38:23.940144 4677 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-index-6fln4" Oct 07 13:38:23 crc kubenswrapper[4677]: I1007 13:38:23.940150 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-6fln4" event={"ID":"32459eb8-0dbe-4046-9798-85e4cb9aca83","Type":"ContainerDied","Data":"971ec747a4481d1eb22f421ff26f9931e447d8743f8a3f8bb6093e169acba357"} Oct 07 13:38:23 crc kubenswrapper[4677]: I1007 13:38:23.940208 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-6fln4" event={"ID":"32459eb8-0dbe-4046-9798-85e4cb9aca83","Type":"ContainerDied","Data":"461200fbe2f54ad933da07954839b23fb1fe9ada098eb0f323a584b80cef2b88"} Oct 07 13:38:23 crc kubenswrapper[4677]: I1007 13:38:23.940233 4677 scope.go:117] "RemoveContainer" containerID="971ec747a4481d1eb22f421ff26f9931e447d8743f8a3f8bb6093e169acba357" Oct 07 13:38:23 crc kubenswrapper[4677]: I1007 13:38:23.952629 4677 scope.go:117] "RemoveContainer" containerID="971ec747a4481d1eb22f421ff26f9931e447d8743f8a3f8bb6093e169acba357" Oct 07 13:38:23 crc kubenswrapper[4677]: E1007 13:38:23.953160 4677 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"971ec747a4481d1eb22f421ff26f9931e447d8743f8a3f8bb6093e169acba357\": container with ID starting with 971ec747a4481d1eb22f421ff26f9931e447d8743f8a3f8bb6093e169acba357 not found: ID does not exist" containerID="971ec747a4481d1eb22f421ff26f9931e447d8743f8a3f8bb6093e169acba357" Oct 07 13:38:23 crc kubenswrapper[4677]: I1007 13:38:23.953248 4677 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"971ec747a4481d1eb22f421ff26f9931e447d8743f8a3f8bb6093e169acba357"} err="failed to get container status \"971ec747a4481d1eb22f421ff26f9931e447d8743f8a3f8bb6093e169acba357\": rpc error: 
code = NotFound desc = could not find container \"971ec747a4481d1eb22f421ff26f9931e447d8743f8a3f8bb6093e169acba357\": container with ID starting with 971ec747a4481d1eb22f421ff26f9931e447d8743f8a3f8bb6093e169acba357 not found: ID does not exist" Oct 07 13:38:23 crc kubenswrapper[4677]: I1007 13:38:23.971586 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9q9sm\" (UniqueName: \"kubernetes.io/projected/32459eb8-0dbe-4046-9798-85e4cb9aca83-kube-api-access-9q9sm\") pod \"32459eb8-0dbe-4046-9798-85e4cb9aca83\" (UID: \"32459eb8-0dbe-4046-9798-85e4cb9aca83\") " Oct 07 13:38:23 crc kubenswrapper[4677]: I1007 13:38:23.977005 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32459eb8-0dbe-4046-9798-85e4cb9aca83-kube-api-access-9q9sm" (OuterVolumeSpecName: "kube-api-access-9q9sm") pod "32459eb8-0dbe-4046-9798-85e4cb9aca83" (UID: "32459eb8-0dbe-4046-9798-85e4cb9aca83"). InnerVolumeSpecName "kube-api-access-9q9sm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:38:24 crc kubenswrapper[4677]: I1007 13:38:24.072965 4677 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9q9sm\" (UniqueName: \"kubernetes.io/projected/32459eb8-0dbe-4046-9798-85e4cb9aca83-kube-api-access-9q9sm\") on node \"crc\" DevicePath \"\"" Oct 07 13:38:24 crc kubenswrapper[4677]: I1007 13:38:24.220703 4677 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-847ff55875-g5gf7" Oct 07 13:38:24 crc kubenswrapper[4677]: I1007 13:38:24.283198 4677 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/mariadb-operator-index-6fln4"] Oct 07 13:38:24 crc kubenswrapper[4677]: I1007 13:38:24.286328 4677 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/mariadb-operator-index-6fln4"] Oct 07 13:38:24 crc kubenswrapper[4677]: I1007 13:38:24.380438 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f1f08780-3bfe-4470-87d2-05bfaf5d89ce-apiservice-cert\") pod \"f1f08780-3bfe-4470-87d2-05bfaf5d89ce\" (UID: \"f1f08780-3bfe-4470-87d2-05bfaf5d89ce\") " Oct 07 13:38:24 crc kubenswrapper[4677]: I1007 13:38:24.380495 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f1f08780-3bfe-4470-87d2-05bfaf5d89ce-webhook-cert\") pod \"f1f08780-3bfe-4470-87d2-05bfaf5d89ce\" (UID: \"f1f08780-3bfe-4470-87d2-05bfaf5d89ce\") " Oct 07 13:38:24 crc kubenswrapper[4677]: I1007 13:38:24.380537 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n5gm4\" (UniqueName: \"kubernetes.io/projected/f1f08780-3bfe-4470-87d2-05bfaf5d89ce-kube-api-access-n5gm4\") pod \"f1f08780-3bfe-4470-87d2-05bfaf5d89ce\" (UID: \"f1f08780-3bfe-4470-87d2-05bfaf5d89ce\") " Oct 07 13:38:24 crc kubenswrapper[4677]: I1007 13:38:24.383809 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1f08780-3bfe-4470-87d2-05bfaf5d89ce-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "f1f08780-3bfe-4470-87d2-05bfaf5d89ce" (UID: "f1f08780-3bfe-4470-87d2-05bfaf5d89ce"). InnerVolumeSpecName "apiservice-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:38:24 crc kubenswrapper[4677]: I1007 13:38:24.384067 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f1f08780-3bfe-4470-87d2-05bfaf5d89ce-kube-api-access-n5gm4" (OuterVolumeSpecName: "kube-api-access-n5gm4") pod "f1f08780-3bfe-4470-87d2-05bfaf5d89ce" (UID: "f1f08780-3bfe-4470-87d2-05bfaf5d89ce"). InnerVolumeSpecName "kube-api-access-n5gm4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:38:24 crc kubenswrapper[4677]: I1007 13:38:24.399127 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1f08780-3bfe-4470-87d2-05bfaf5d89ce-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "f1f08780-3bfe-4470-87d2-05bfaf5d89ce" (UID: "f1f08780-3bfe-4470-87d2-05bfaf5d89ce"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:38:24 crc kubenswrapper[4677]: I1007 13:38:24.482719 4677 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f1f08780-3bfe-4470-87d2-05bfaf5d89ce-apiservice-cert\") on node \"crc\" DevicePath \"\"" Oct 07 13:38:24 crc kubenswrapper[4677]: I1007 13:38:24.482769 4677 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f1f08780-3bfe-4470-87d2-05bfaf5d89ce-webhook-cert\") on node \"crc\" DevicePath \"\"" Oct 07 13:38:24 crc kubenswrapper[4677]: I1007 13:38:24.482790 4677 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n5gm4\" (UniqueName: \"kubernetes.io/projected/f1f08780-3bfe-4470-87d2-05bfaf5d89ce-kube-api-access-n5gm4\") on node \"crc\" DevicePath \"\"" Oct 07 13:38:24 crc kubenswrapper[4677]: I1007 13:38:24.953126 4677 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-847ff55875-g5gf7" Oct 07 13:38:24 crc kubenswrapper[4677]: I1007 13:38:24.953115 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-847ff55875-g5gf7" event={"ID":"f1f08780-3bfe-4470-87d2-05bfaf5d89ce","Type":"ContainerDied","Data":"8d84ae3d0dc7d359a9af69efc25bae9e038ec65a6a8d74dcb63263cd957ef120"} Oct 07 13:38:24 crc kubenswrapper[4677]: I1007 13:38:24.953275 4677 scope.go:117] "RemoveContainer" containerID="4eab8717e6659894ada0e2205510ff0edba5e637138f2090e1b917320033e79b" Oct 07 13:38:24 crc kubenswrapper[4677]: I1007 13:38:24.979832 4677 scope.go:117] "RemoveContainer" containerID="544fb2e8b8509a0c4dfac25f799cad7c582e63dfe6cedf6652d38e359ad8ec15" Oct 07 13:38:25 crc kubenswrapper[4677]: I1007 13:38:25.003263 4677 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-847ff55875-g5gf7"] Oct 07 13:38:25 crc kubenswrapper[4677]: I1007 13:38:25.011062 4677 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-847ff55875-g5gf7"] Oct 07 13:38:25 crc kubenswrapper[4677]: I1007 13:38:25.312775 4677 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0f60c875-0061-46ee-b9b9-8d0dc56ad361" path="/var/lib/kubelet/pods/0f60c875-0061-46ee-b9b9-8d0dc56ad361/volumes" Oct 07 13:38:25 crc kubenswrapper[4677]: I1007 13:38:25.313606 4677 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="32459eb8-0dbe-4046-9798-85e4cb9aca83" path="/var/lib/kubelet/pods/32459eb8-0dbe-4046-9798-85e4cb9aca83/volumes" Oct 07 13:38:25 crc kubenswrapper[4677]: I1007 13:38:25.314243 4677 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f1f08780-3bfe-4470-87d2-05bfaf5d89ce" path="/var/lib/kubelet/pods/f1f08780-3bfe-4470-87d2-05bfaf5d89ce/volumes" Oct 07 13:38:25 crc kubenswrapper[4677]: I1007 13:38:25.672301 4677 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-779fc9694b-8z6bc"] Oct 07 13:38:25 crc kubenswrapper[4677]: I1007 13:38:25.672774 4677 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-8z6bc" podUID="00ec0b84-feea-4d16-b2f7-8935555cee0d" containerName="operator" containerID="cri-o://27568c198345ee4c91033ba5248dec9e2d2b3334c12a0d195de3af20cb358955" gracePeriod=10 Oct 07 13:38:25 crc kubenswrapper[4677]: I1007 13:38:25.941514 4677 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-cxhtv"] Oct 07 13:38:25 crc kubenswrapper[4677]: I1007 13:38:25.941753 4677 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/rabbitmq-cluster-operator-index-cxhtv" podUID="b24052e2-c147-452e-bc01-0970fe195485" containerName="registry-server" containerID="cri-o://86e9665515dd78001f154b523bf7e079791485abdf5577c38262fd3a6616ce80" gracePeriod=30 Oct 07 13:38:25 crc kubenswrapper[4677]: I1007 13:38:25.963162 4677 generic.go:334] "Generic (PLEG): container finished" podID="00ec0b84-feea-4d16-b2f7-8935555cee0d" containerID="27568c198345ee4c91033ba5248dec9e2d2b3334c12a0d195de3af20cb358955" exitCode=0 Oct 07 13:38:25 crc kubenswrapper[4677]: I1007 13:38:25.963661 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-8z6bc" 
event={"ID":"00ec0b84-feea-4d16-b2f7-8935555cee0d","Type":"ContainerDied","Data":"27568c198345ee4c91033ba5248dec9e2d2b3334c12a0d195de3af20cb358955"} Oct 07 13:38:25 crc kubenswrapper[4677]: I1007 13:38:25.982865 4677 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e59026hnp"] Oct 07 13:38:25 crc kubenswrapper[4677]: I1007 13:38:25.991977 4677 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e59026hnp"] Oct 07 13:38:26 crc kubenswrapper[4677]: I1007 13:38:26.093884 4677 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-8z6bc" Oct 07 13:38:26 crc kubenswrapper[4677]: I1007 13:38:26.201033 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rbn5x\" (UniqueName: \"kubernetes.io/projected/00ec0b84-feea-4d16-b2f7-8935555cee0d-kube-api-access-rbn5x\") pod \"00ec0b84-feea-4d16-b2f7-8935555cee0d\" (UID: \"00ec0b84-feea-4d16-b2f7-8935555cee0d\") " Oct 07 13:38:26 crc kubenswrapper[4677]: I1007 13:38:26.204503 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/00ec0b84-feea-4d16-b2f7-8935555cee0d-kube-api-access-rbn5x" (OuterVolumeSpecName: "kube-api-access-rbn5x") pod "00ec0b84-feea-4d16-b2f7-8935555cee0d" (UID: "00ec0b84-feea-4d16-b2f7-8935555cee0d"). InnerVolumeSpecName "kube-api-access-rbn5x". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:38:26 crc kubenswrapper[4677]: I1007 13:38:26.294353 4677 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-index-cxhtv" Oct 07 13:38:26 crc kubenswrapper[4677]: I1007 13:38:26.303317 4677 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rbn5x\" (UniqueName: \"kubernetes.io/projected/00ec0b84-feea-4d16-b2f7-8935555cee0d-kube-api-access-rbn5x\") on node \"crc\" DevicePath \"\"" Oct 07 13:38:26 crc kubenswrapper[4677]: I1007 13:38:26.404323 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rr2fn\" (UniqueName: \"kubernetes.io/projected/b24052e2-c147-452e-bc01-0970fe195485-kube-api-access-rr2fn\") pod \"b24052e2-c147-452e-bc01-0970fe195485\" (UID: \"b24052e2-c147-452e-bc01-0970fe195485\") " Oct 07 13:38:26 crc kubenswrapper[4677]: I1007 13:38:26.407599 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b24052e2-c147-452e-bc01-0970fe195485-kube-api-access-rr2fn" (OuterVolumeSpecName: "kube-api-access-rr2fn") pod "b24052e2-c147-452e-bc01-0970fe195485" (UID: "b24052e2-c147-452e-bc01-0970fe195485"). InnerVolumeSpecName "kube-api-access-rr2fn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:38:26 crc kubenswrapper[4677]: I1007 13:38:26.505223 4677 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rr2fn\" (UniqueName: \"kubernetes.io/projected/b24052e2-c147-452e-bc01-0970fe195485-kube-api-access-rr2fn\") on node \"crc\" DevicePath \"\"" Oct 07 13:38:26 crc kubenswrapper[4677]: I1007 13:38:26.972163 4677 generic.go:334] "Generic (PLEG): container finished" podID="b24052e2-c147-452e-bc01-0970fe195485" containerID="86e9665515dd78001f154b523bf7e079791485abdf5577c38262fd3a6616ce80" exitCode=0 Oct 07 13:38:26 crc kubenswrapper[4677]: I1007 13:38:26.972227 4677 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-index-cxhtv" Oct 07 13:38:26 crc kubenswrapper[4677]: I1007 13:38:26.972226 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-index-cxhtv" event={"ID":"b24052e2-c147-452e-bc01-0970fe195485","Type":"ContainerDied","Data":"86e9665515dd78001f154b523bf7e079791485abdf5577c38262fd3a6616ce80"} Oct 07 13:38:26 crc kubenswrapper[4677]: I1007 13:38:26.974336 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-index-cxhtv" event={"ID":"b24052e2-c147-452e-bc01-0970fe195485","Type":"ContainerDied","Data":"072eebb6c0a61c0d10ce4c2190a3f199c8347a203836b8e73a31fccd53380c8b"} Oct 07 13:38:26 crc kubenswrapper[4677]: I1007 13:38:26.974412 4677 scope.go:117] "RemoveContainer" containerID="86e9665515dd78001f154b523bf7e079791485abdf5577c38262fd3a6616ce80" Oct 07 13:38:26 crc kubenswrapper[4677]: I1007 13:38:26.975556 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-8z6bc" event={"ID":"00ec0b84-feea-4d16-b2f7-8935555cee0d","Type":"ContainerDied","Data":"c6d96d1afc4affe6ddbbb0d229ea04c07a51a1b840559889fac8391358f80ed7"} Oct 07 13:38:26 crc kubenswrapper[4677]: I1007 13:38:26.975586 4677 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-8z6bc" Oct 07 13:38:27 crc kubenswrapper[4677]: I1007 13:38:27.000757 4677 scope.go:117] "RemoveContainer" containerID="86e9665515dd78001f154b523bf7e079791485abdf5577c38262fd3a6616ce80" Oct 07 13:38:27 crc kubenswrapper[4677]: E1007 13:38:27.004538 4677 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"86e9665515dd78001f154b523bf7e079791485abdf5577c38262fd3a6616ce80\": container with ID starting with 86e9665515dd78001f154b523bf7e079791485abdf5577c38262fd3a6616ce80 not found: ID does not exist" containerID="86e9665515dd78001f154b523bf7e079791485abdf5577c38262fd3a6616ce80" Oct 07 13:38:27 crc kubenswrapper[4677]: I1007 13:38:27.004597 4677 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"86e9665515dd78001f154b523bf7e079791485abdf5577c38262fd3a6616ce80"} err="failed to get container status \"86e9665515dd78001f154b523bf7e079791485abdf5577c38262fd3a6616ce80\": rpc error: code = NotFound desc = could not find container \"86e9665515dd78001f154b523bf7e079791485abdf5577c38262fd3a6616ce80\": container with ID starting with 86e9665515dd78001f154b523bf7e079791485abdf5577c38262fd3a6616ce80 not found: ID does not exist" Oct 07 13:38:27 crc kubenswrapper[4677]: I1007 13:38:27.004634 4677 scope.go:117] "RemoveContainer" containerID="27568c198345ee4c91033ba5248dec9e2d2b3334c12a0d195de3af20cb358955" Oct 07 13:38:27 crc kubenswrapper[4677]: I1007 13:38:27.012276 4677 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-cxhtv"] Oct 07 13:38:27 crc kubenswrapper[4677]: I1007 13:38:27.028913 4677 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-cxhtv"] Oct 07 13:38:27 crc kubenswrapper[4677]: I1007 13:38:27.037817 4677 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-779fc9694b-8z6bc"] Oct 07 13:38:27 crc kubenswrapper[4677]: I1007 13:38:27.041671 4677 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-779fc9694b-8z6bc"] Oct 07 13:38:27 crc kubenswrapper[4677]: I1007 13:38:27.312253 4677 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="00ec0b84-feea-4d16-b2f7-8935555cee0d" path="/var/lib/kubelet/pods/00ec0b84-feea-4d16-b2f7-8935555cee0d/volumes" Oct 07 13:38:27 crc kubenswrapper[4677]: I1007 13:38:27.312986 4677 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b24052e2-c147-452e-bc01-0970fe195485" path="/var/lib/kubelet/pods/b24052e2-c147-452e-bc01-0970fe195485/volumes" Oct 07 13:38:27 crc kubenswrapper[4677]: I1007 13:38:27.313764 4677 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba6a4a8e-02b4-4ae3-8e66-77416eb060ae" path="/var/lib/kubelet/pods/ba6a4a8e-02b4-4ae3-8e66-77416eb060ae/volumes" Oct 07 13:38:34 crc kubenswrapper[4677]: I1007 13:38:34.302627 4677 scope.go:117] "RemoveContainer" containerID="9c395efdf36cc3142121da31bd43ed02b4d941c9bf27ad091d7892140334598b" Oct 07 13:38:34 crc kubenswrapper[4677]: E1007 13:38:34.303166 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-r7cnz_openshift-machine-config-operator(7879fa59-a7cb-4d29-ba3a-c91f43bfcba6)\"" pod="openshift-machine-config-operator/machine-config-daemon-r7cnz" podUID="7879fa59-a7cb-4d29-ba3a-c91f43bfcba6" Oct 07 13:38:34 crc kubenswrapper[4677]: I1007 13:38:34.427260 4677 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-dtsxl/must-gather-nj4nl"] Oct 07 13:38:34 crc kubenswrapper[4677]: E1007 13:38:34.427624 4677 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f1035cf-bd50-4425-a342-92a71ab7f16e" containerName="kube-rbac-proxy" Oct 07 13:38:34 crc kubenswrapper[4677]: I1007 13:38:34.427657 4677 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f1035cf-bd50-4425-a342-92a71ab7f16e" containerName="kube-rbac-proxy" Oct 07 13:38:34 crc kubenswrapper[4677]: E1007 13:38:34.427680 4677 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b168b6a-df61-44ed-8a09-ed30d3ecc2ea" containerName="mysql-bootstrap" Oct 07 13:38:34 crc kubenswrapper[4677]: I1007 13:38:34.427690 4677 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b168b6a-df61-44ed-8a09-ed30d3ecc2ea" containerName="mysql-bootstrap" Oct 07 13:38:34 crc kubenswrapper[4677]: E1007 13:38:34.427711 4677 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c50e7113-f37a-4ea1-9d53-c53106564a48" containerName="setup-container" Oct 07 13:38:34 crc kubenswrapper[4677]: I1007 13:38:34.427722 4677 state_mem.go:107] "Deleted CPUSet assignment" podUID="c50e7113-f37a-4ea1-9d53-c53106564a48" containerName="setup-container" Oct 07 13:38:34 crc kubenswrapper[4677]: E1007 13:38:34.427739 4677 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4ee04d7-9071-415d-94cd-daaace73b138" containerName="keystone-api" Oct 07 13:38:34 crc kubenswrapper[4677]: I1007 13:38:34.427749 4677 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4ee04d7-9071-415d-94cd-daaace73b138" containerName="keystone-api" Oct 07 13:38:34 crc kubenswrapper[4677]: E1007 13:38:34.427767 4677 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97f6efbc-5f7a-48b6-ae09-35afb8f8dd7e" containerName="kube-rbac-proxy" Oct 07 13:38:34 crc kubenswrapper[4677]: I1007 13:38:34.427776 4677 state_mem.go:107] "Deleted CPUSet assignment" podUID="97f6efbc-5f7a-48b6-ae09-35afb8f8dd7e" containerName="kube-rbac-proxy" Oct 07 13:38:34 crc kubenswrapper[4677]: E1007 13:38:34.427789 4677 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b168b6a-df61-44ed-8a09-ed30d3ecc2ea" containerName="galera" Oct 07 13:38:34 crc kubenswrapper[4677]: I1007 13:38:34.427801 4677 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b168b6a-df61-44ed-8a09-ed30d3ecc2ea" containerName="galera" Oct 07 13:38:34 crc kubenswrapper[4677]: E1007 13:38:34.427814 4677 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00ec0b84-feea-4d16-b2f7-8935555cee0d" containerName="operator" Oct 07 13:38:34 crc kubenswrapper[4677]: I1007 13:38:34.427825 4677 state_mem.go:107] "Deleted CPUSet assignment" podUID="00ec0b84-feea-4d16-b2f7-8935555cee0d" containerName="operator" Oct 07 13:38:34 crc kubenswrapper[4677]: E1007 13:38:34.427838 4677 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ec4d41c-e5c7-49c2-be5b-2b228d1ddcb8" containerName="galera" Oct 07 13:38:34 crc kubenswrapper[4677]: I1007 13:38:34.427849 4677 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ec4d41c-e5c7-49c2-be5b-2b228d1ddcb8" containerName="galera" Oct 07 13:38:34 crc kubenswrapper[4677]: 
E1007 13:38:34.427865 4677 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b24052e2-c147-452e-bc01-0970fe195485" containerName="registry-server" Oct 07 13:38:34 crc kubenswrapper[4677]: I1007 13:38:34.427877 4677 state_mem.go:107] "Deleted CPUSet assignment" podUID="b24052e2-c147-452e-bc01-0970fe195485" containerName="registry-server" Oct 07 13:38:34 crc kubenswrapper[4677]: E1007 13:38:34.427894 4677 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f1035cf-bd50-4425-a342-92a71ab7f16e" containerName="manager" Oct 07 13:38:34 crc kubenswrapper[4677]: I1007 13:38:34.427905 4677 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f1035cf-bd50-4425-a342-92a71ab7f16e" containerName="manager" Oct 07 13:38:34 crc kubenswrapper[4677]: E1007 13:38:34.427923 4677 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8dabf1b-2f0f-4f7f-8342-31001928330b" containerName="memcached" Oct 07 13:38:34 crc kubenswrapper[4677]: I1007 13:38:34.427934 4677 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8dabf1b-2f0f-4f7f-8342-31001928330b" containerName="memcached" Oct 07 13:38:34 crc kubenswrapper[4677]: E1007 13:38:34.427950 4677 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c50e7113-f37a-4ea1-9d53-c53106564a48" containerName="rabbitmq" Oct 07 13:38:34 crc kubenswrapper[4677]: I1007 13:38:34.427961 4677 state_mem.go:107] "Deleted CPUSet assignment" podUID="c50e7113-f37a-4ea1-9d53-c53106564a48" containerName="rabbitmq" Oct 07 13:38:34 crc kubenswrapper[4677]: E1007 13:38:34.427979 4677 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1f08780-3bfe-4470-87d2-05bfaf5d89ce" containerName="manager" Oct 07 13:38:34 crc kubenswrapper[4677]: I1007 13:38:34.427991 4677 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1f08780-3bfe-4470-87d2-05bfaf5d89ce" containerName="manager" Oct 07 13:38:34 crc kubenswrapper[4677]: E1007 13:38:34.428005 4677 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97f6efbc-5f7a-48b6-ae09-35afb8f8dd7e" containerName="manager" Oct 07 13:38:34 crc kubenswrapper[4677]: I1007 13:38:34.428016 4677 state_mem.go:107] "Deleted CPUSet assignment" podUID="97f6efbc-5f7a-48b6-ae09-35afb8f8dd7e" containerName="manager" Oct 07 13:38:34 crc kubenswrapper[4677]: E1007 13:38:34.428029 4677 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf5b66ae-3d1d-4299-bfe7-d3f3eb705177" containerName="galera" Oct 07 13:38:34 crc kubenswrapper[4677]: I1007 13:38:34.428039 4677 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf5b66ae-3d1d-4299-bfe7-d3f3eb705177" containerName="galera" Oct 07 13:38:34 crc kubenswrapper[4677]: E1007 13:38:34.428055 4677 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="869e5c88-2679-4b90-b674-bba233dc88e0" containerName="registry-server" Oct 07 13:38:34 crc kubenswrapper[4677]: I1007 13:38:34.428066 4677 state_mem.go:107] "Deleted CPUSet assignment" podUID="869e5c88-2679-4b90-b674-bba233dc88e0" containerName="registry-server" Oct 07 13:38:34 crc kubenswrapper[4677]: E1007 13:38:34.428080 4677 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1f08780-3bfe-4470-87d2-05bfaf5d89ce" containerName="kube-rbac-proxy" Oct 07 13:38:34 crc kubenswrapper[4677]: I1007 13:38:34.428090 4677 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1f08780-3bfe-4470-87d2-05bfaf5d89ce" containerName="kube-rbac-proxy" Oct 07 13:38:34 crc kubenswrapper[4677]: E1007 13:38:34.428107 4677 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="32459eb8-0dbe-4046-9798-85e4cb9aca83" containerName="registry-server" Oct 07 13:38:34 crc kubenswrapper[4677]: I1007 13:38:34.428118 4677 state_mem.go:107] "Deleted CPUSet assignment" podUID="32459eb8-0dbe-4046-9798-85e4cb9aca83" containerName="registry-server" Oct 07 13:38:34 crc kubenswrapper[4677]: E1007 13:38:34.428131 4677 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d440c3d-e4d7-4b91-87e7-0f73d587d638" containerName="registry-server" Oct 07 13:38:34 crc kubenswrapper[4677]: I1007 13:38:34.428142 4677 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d440c3d-e4d7-4b91-87e7-0f73d587d638" containerName="registry-server" Oct 07 13:38:34 crc kubenswrapper[4677]: E1007 13:38:34.428159 4677 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf5b66ae-3d1d-4299-bfe7-d3f3eb705177" containerName="mysql-bootstrap" Oct 07 13:38:34 crc kubenswrapper[4677]: I1007 13:38:34.428170 4677 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf5b66ae-3d1d-4299-bfe7-d3f3eb705177" containerName="mysql-bootstrap" Oct 07 13:38:34 crc kubenswrapper[4677]: E1007 13:38:34.428186 4677 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ec4d41c-e5c7-49c2-be5b-2b228d1ddcb8" containerName="mysql-bootstrap" Oct 07 13:38:34 crc kubenswrapper[4677]: I1007 13:38:34.428197 4677 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ec4d41c-e5c7-49c2-be5b-2b228d1ddcb8" containerName="mysql-bootstrap" Oct 07 13:38:34 crc kubenswrapper[4677]: I1007 13:38:34.428365 4677 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b168b6a-df61-44ed-8a09-ed30d3ecc2ea" containerName="galera" Oct 07 13:38:34 crc kubenswrapper[4677]: I1007 13:38:34.428379 4677 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4ee04d7-9071-415d-94cd-daaace73b138" containerName="keystone-api" Oct 07 13:38:34 crc kubenswrapper[4677]: I1007 13:38:34.428397 4677 memory_manager.go:354] "RemoveStaleState removing state" podUID="00ec0b84-feea-4d16-b2f7-8935555cee0d" containerName="operator" Oct 07 13:38:34 crc kubenswrapper[4677]: I1007 13:38:34.428410 4677 memory_manager.go:354] "RemoveStaleState removing state" podUID="869e5c88-2679-4b90-b674-bba233dc88e0" containerName="registry-server" Oct 07 13:38:34 crc kubenswrapper[4677]: I1007 13:38:34.428421 4677 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf5b66ae-3d1d-4299-bfe7-d3f3eb705177" containerName="galera" Oct 07 13:38:34 crc kubenswrapper[4677]: I1007 13:38:34.428496 4677 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1f08780-3bfe-4470-87d2-05bfaf5d89ce" containerName="kube-rbac-proxy" Oct 07 13:38:34 crc kubenswrapper[4677]: I1007 13:38:34.428512 4677 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ec4d41c-e5c7-49c2-be5b-2b228d1ddcb8" containerName="galera" Oct 07 13:38:34 crc kubenswrapper[4677]: I1007 13:38:34.428528 4677 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d440c3d-e4d7-4b91-87e7-0f73d587d638" containerName="registry-server" Oct 07 13:38:34 crc kubenswrapper[4677]: I1007 13:38:34.428540 4677 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1f08780-3bfe-4470-87d2-05bfaf5d89ce" containerName="manager" Oct 07 13:38:34 crc kubenswrapper[4677]: I1007 13:38:34.428555 4677 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f1035cf-bd50-4425-a342-92a71ab7f16e" containerName="kube-rbac-proxy" Oct 07 13:38:34 crc kubenswrapper[4677]: I1007 13:38:34.428571 4677 
memory_manager.go:354] "RemoveStaleState removing state" podUID="c50e7113-f37a-4ea1-9d53-c53106564a48" containerName="rabbitmq" Oct 07 13:38:34 crc kubenswrapper[4677]: I1007 13:38:34.428585 4677 memory_manager.go:354] "RemoveStaleState removing state" podUID="a8dabf1b-2f0f-4f7f-8342-31001928330b" containerName="memcached" Oct 07 13:38:34 crc kubenswrapper[4677]: I1007 13:38:34.428604 4677 memory_manager.go:354] "RemoveStaleState removing state" podUID="32459eb8-0dbe-4046-9798-85e4cb9aca83" containerName="registry-server" Oct 07 13:38:34 crc kubenswrapper[4677]: I1007 13:38:34.428621 4677 memory_manager.go:354] "RemoveStaleState removing state" podUID="97f6efbc-5f7a-48b6-ae09-35afb8f8dd7e" containerName="manager" Oct 07 13:38:34 crc kubenswrapper[4677]: I1007 13:38:34.428636 4677 memory_manager.go:354] "RemoveStaleState removing state" podUID="97f6efbc-5f7a-48b6-ae09-35afb8f8dd7e" containerName="kube-rbac-proxy" Oct 07 13:38:34 crc kubenswrapper[4677]: I1007 13:38:34.428649 4677 memory_manager.go:354] "RemoveStaleState removing state" podUID="b24052e2-c147-452e-bc01-0970fe195485" containerName="registry-server" Oct 07 13:38:34 crc kubenswrapper[4677]: I1007 13:38:34.428666 4677 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f1035cf-bd50-4425-a342-92a71ab7f16e" containerName="manager" Oct 07 13:38:34 crc kubenswrapper[4677]: I1007 13:38:34.429638 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-dtsxl/must-gather-nj4nl" Oct 07 13:38:34 crc kubenswrapper[4677]: I1007 13:38:34.439630 4677 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-dtsxl"/"openshift-service-ca.crt" Oct 07 13:38:34 crc kubenswrapper[4677]: I1007 13:38:34.439756 4677 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-dtsxl"/"kube-root-ca.crt" Oct 07 13:38:34 crc kubenswrapper[4677]: I1007 13:38:34.452747 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-dtsxl/must-gather-nj4nl"] Oct 07 13:38:34 crc kubenswrapper[4677]: I1007 13:38:34.612375 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/0d68b9d9-1520-4f23-9690-eb30cd87be84-must-gather-output\") pod \"must-gather-nj4nl\" (UID: \"0d68b9d9-1520-4f23-9690-eb30cd87be84\") " pod="openshift-must-gather-dtsxl/must-gather-nj4nl" Oct 07 13:38:34 crc kubenswrapper[4677]: I1007 13:38:34.612457 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6vwn2\" (UniqueName: \"kubernetes.io/projected/0d68b9d9-1520-4f23-9690-eb30cd87be84-kube-api-access-6vwn2\") pod \"must-gather-nj4nl\" (UID: \"0d68b9d9-1520-4f23-9690-eb30cd87be84\") " pod="openshift-must-gather-dtsxl/must-gather-nj4nl" Oct 07 13:38:34 crc kubenswrapper[4677]: I1007 13:38:34.713623 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/0d68b9d9-1520-4f23-9690-eb30cd87be84-must-gather-output\") pod \"must-gather-nj4nl\" (UID: \"0d68b9d9-1520-4f23-9690-eb30cd87be84\") " pod="openshift-must-gather-dtsxl/must-gather-nj4nl" Oct 07 13:38:34 crc kubenswrapper[4677]: I1007 13:38:34.713696 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6vwn2\" (UniqueName: \"kubernetes.io/projected/0d68b9d9-1520-4f23-9690-eb30cd87be84-kube-api-access-6vwn2\") 
pod \"must-gather-nj4nl\" (UID: \"0d68b9d9-1520-4f23-9690-eb30cd87be84\") " pod="openshift-must-gather-dtsxl/must-gather-nj4nl" Oct 07 13:38:34 crc kubenswrapper[4677]: I1007 13:38:34.714058 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/0d68b9d9-1520-4f23-9690-eb30cd87be84-must-gather-output\") pod \"must-gather-nj4nl\" (UID: \"0d68b9d9-1520-4f23-9690-eb30cd87be84\") " pod="openshift-must-gather-dtsxl/must-gather-nj4nl" Oct 07 13:38:34 crc kubenswrapper[4677]: I1007 13:38:34.730488 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6vwn2\" (UniqueName: \"kubernetes.io/projected/0d68b9d9-1520-4f23-9690-eb30cd87be84-kube-api-access-6vwn2\") pod \"must-gather-nj4nl\" (UID: \"0d68b9d9-1520-4f23-9690-eb30cd87be84\") " pod="openshift-must-gather-dtsxl/must-gather-nj4nl" Oct 07 13:38:34 crc kubenswrapper[4677]: I1007 13:38:34.753278 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-dtsxl/must-gather-nj4nl" Oct 07 13:38:35 crc kubenswrapper[4677]: I1007 13:38:35.123944 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-dtsxl/must-gather-nj4nl"] Oct 07 13:38:36 crc kubenswrapper[4677]: I1007 13:38:36.037064 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-dtsxl/must-gather-nj4nl" event={"ID":"0d68b9d9-1520-4f23-9690-eb30cd87be84","Type":"ContainerStarted","Data":"19a689c3de36b1a5156a1d7cb03a5a49595aedcf70af46fa06b358a7e3f952de"} Oct 07 13:38:40 crc kubenswrapper[4677]: I1007 13:38:40.061853 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-dtsxl/must-gather-nj4nl" event={"ID":"0d68b9d9-1520-4f23-9690-eb30cd87be84","Type":"ContainerStarted","Data":"c1ad74c7d16fc11086bc976b6111028528c18543bef9806f8d06815fc00445f0"} Oct 07 13:38:40 crc kubenswrapper[4677]: I1007 13:38:40.062350 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-dtsxl/must-gather-nj4nl" event={"ID":"0d68b9d9-1520-4f23-9690-eb30cd87be84","Type":"ContainerStarted","Data":"89496c967f5373ddebe75e014ab13173a725b195b537372ac4dc7675ab1e6906"} Oct 07 13:38:40 crc kubenswrapper[4677]: I1007 13:38:40.077361 4677 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-dtsxl/must-gather-nj4nl" podStartSLOduration=2.1864132769999998 podStartE2EDuration="6.077346679s" podCreationTimestamp="2025-10-07 13:38:34 +0000 UTC" firstStartedPulling="2025-10-07 13:38:35.129639224 +0000 UTC m=+1886.615348349" lastFinishedPulling="2025-10-07 13:38:39.020572586 +0000 UTC m=+1890.506281751" observedRunningTime="2025-10-07 13:38:40.075689181 +0000 UTC m=+1891.561398296" watchObservedRunningTime="2025-10-07 13:38:40.077346679 +0000 UTC m=+1891.563055794" Oct 07 13:38:48 crc kubenswrapper[4677]: I1007 13:38:48.302745 4677 scope.go:117] "RemoveContainer" containerID="9c395efdf36cc3142121da31bd43ed02b4d941c9bf27ad091d7892140334598b" Oct 07 13:38:48 crc kubenswrapper[4677]: E1007 13:38:48.303367 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-r7cnz_openshift-machine-config-operator(7879fa59-a7cb-4d29-ba3a-c91f43bfcba6)\"" pod="openshift-machine-config-operator/machine-config-daemon-r7cnz" podUID="7879fa59-a7cb-4d29-ba3a-c91f43bfcba6" Oct 07 
13:38:59 crc kubenswrapper[4677]: I1007 13:38:59.306715 4677 scope.go:117] "RemoveContainer" containerID="9c395efdf36cc3142121da31bd43ed02b4d941c9bf27ad091d7892140334598b" Oct 07 13:38:59 crc kubenswrapper[4677]: E1007 13:38:59.307632 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-r7cnz_openshift-machine-config-operator(7879fa59-a7cb-4d29-ba3a-c91f43bfcba6)\"" pod="openshift-machine-config-operator/machine-config-daemon-r7cnz" podUID="7879fa59-a7cb-4d29-ba3a-c91f43bfcba6" Oct 07 13:39:11 crc kubenswrapper[4677]: I1007 13:39:11.400797 4677 scope.go:117] "RemoveContainer" containerID="92010fddc8e22ad080f25639af06d5e5bd9194ff47a8120801fed1c37f956f0d" Oct 07 13:39:11 crc kubenswrapper[4677]: I1007 13:39:11.419905 4677 scope.go:117] "RemoveContainer" containerID="e322569eb2b950c724b113ea499bc83fef690e7f5beb87e95037d3cfd0bdf326" Oct 07 13:39:11 crc kubenswrapper[4677]: I1007 13:39:11.445653 4677 scope.go:117] "RemoveContainer" containerID="17f2f6c42ca454551c29ad8f0e31600ecdd63b1c1cc2d590b728582faacdad31" Oct 07 13:39:11 crc kubenswrapper[4677]: I1007 13:39:11.469696 4677 scope.go:117] "RemoveContainer" containerID="a08ecfd8070995620f0b3356874651c81838f936e76b44ef22e5ac692f292f00" Oct 07 13:39:11 crc kubenswrapper[4677]: I1007 13:39:11.486281 4677 scope.go:117] "RemoveContainer" containerID="452b3bae6954682929a386fc3244997b62e463ca95f0a3e27a24a05b67cd9044" Oct 07 13:39:11 crc kubenswrapper[4677]: I1007 13:39:11.505570 4677 scope.go:117] "RemoveContainer" containerID="0375e1d9f3b6c4bb3acd12bcbc377800984e5d01f9cf5aeb7ce6f626e0654505" Oct 07 13:39:11 crc kubenswrapper[4677]: I1007 13:39:11.524928 4677 scope.go:117] "RemoveContainer" containerID="c7f1f68e0581e41d01275ddb5a9281dd1cc824611e6f7d5fd460d639b880d032" Oct 07 13:39:11 crc kubenswrapper[4677]: I1007 13:39:11.545602 4677 scope.go:117] "RemoveContainer" containerID="1dc3528812cc007c9d3af3b1e6991eeb10034ed58c9309bfe69e7109e5f0a843" Oct 07 13:39:11 crc kubenswrapper[4677]: I1007 13:39:11.564216 4677 scope.go:117] "RemoveContainer" containerID="c190424511409e29680335d531107ec5129222100e8f8ca39e9d77a8db08acf2" Oct 07 13:39:11 crc kubenswrapper[4677]: I1007 13:39:11.579850 4677 scope.go:117] "RemoveContainer" containerID="8f821c67dc0ae0b965267cb00001c219393d199afcf15b092b853968e37b488c" Oct 07 13:39:11 crc kubenswrapper[4677]: I1007 13:39:11.594802 4677 scope.go:117] "RemoveContainer" containerID="d1aa332bb0e5dac07aa681dedf35083ea91ca2bc6b8112e181d9836d8d1a6540" Oct 07 13:39:11 crc kubenswrapper[4677]: I1007 13:39:11.607927 4677 scope.go:117] "RemoveContainer" containerID="b19eb91cb9b2f2b118237689ff5b2659609146e9eece63852675b5958912403e" Oct 07 13:39:14 crc kubenswrapper[4677]: I1007 13:39:14.302924 4677 scope.go:117] "RemoveContainer" containerID="9c395efdf36cc3142121da31bd43ed02b4d941c9bf27ad091d7892140334598b" Oct 07 13:39:14 crc kubenswrapper[4677]: E1007 13:39:14.303406 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-r7cnz_openshift-machine-config-operator(7879fa59-a7cb-4d29-ba3a-c91f43bfcba6)\"" pod="openshift-machine-config-operator/machine-config-daemon-r7cnz" podUID="7879fa59-a7cb-4d29-ba3a-c91f43bfcba6" Oct 07 13:39:19 crc kubenswrapper[4677]: I1007 13:39:19.630039 4677 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-wklrf_2fef6870-fac6-49bc-8471-fb78198ba057/control-plane-machine-set-operator/0.log" Oct 07 13:39:19 crc kubenswrapper[4677]: I1007 13:39:19.756616 4677 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-b5qm4_5172e7b5-3ef1-4f51-8874-8d4ac858284b/kube-rbac-proxy/0.log" Oct 07 13:39:19 crc kubenswrapper[4677]: I1007 13:39:19.757993 4677 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-b5qm4_5172e7b5-3ef1-4f51-8874-8d4ac858284b/machine-api-operator/0.log" Oct 07 13:39:29 crc kubenswrapper[4677]: I1007 13:39:29.312614 4677 scope.go:117] "RemoveContainer" containerID="9c395efdf36cc3142121da31bd43ed02b4d941c9bf27ad091d7892140334598b" Oct 07 13:39:29 crc kubenswrapper[4677]: E1007 13:39:29.313592 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-r7cnz_openshift-machine-config-operator(7879fa59-a7cb-4d29-ba3a-c91f43bfcba6)\"" pod="openshift-machine-config-operator/machine-config-daemon-r7cnz" podUID="7879fa59-a7cb-4d29-ba3a-c91f43bfcba6" Oct 07 13:39:33 crc kubenswrapper[4677]: I1007 13:39:33.861469 4677 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-68d546b9d8-kmjvc_fc03bc45-d39d-4302-839a-1f89960e640f/kube-rbac-proxy/0.log" Oct 07 13:39:33 crc kubenswrapper[4677]: I1007 13:39:33.917824 4677 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-68d546b9d8-kmjvc_fc03bc45-d39d-4302-839a-1f89960e640f/controller/0.log" Oct 07 13:39:34 crc kubenswrapper[4677]: I1007 13:39:34.099850 4677 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hq5q5_c6f0a46e-3591-4f18-9ff7-867b546b2273/cp-frr-files/0.log" Oct 07 13:39:34 crc kubenswrapper[4677]: I1007 13:39:34.234884 4677 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hq5q5_c6f0a46e-3591-4f18-9ff7-867b546b2273/cp-reloader/0.log" Oct 07 13:39:34 crc kubenswrapper[4677]: I1007 13:39:34.250469 4677 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hq5q5_c6f0a46e-3591-4f18-9ff7-867b546b2273/cp-metrics/0.log" Oct 07 13:39:34 crc kubenswrapper[4677]: I1007 13:39:34.261130 4677 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hq5q5_c6f0a46e-3591-4f18-9ff7-867b546b2273/cp-frr-files/0.log" Oct 07 13:39:34 crc kubenswrapper[4677]: I1007 13:39:34.319610 4677 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hq5q5_c6f0a46e-3591-4f18-9ff7-867b546b2273/cp-reloader/0.log" Oct 07 13:39:34 crc kubenswrapper[4677]: I1007 13:39:34.431156 4677 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hq5q5_c6f0a46e-3591-4f18-9ff7-867b546b2273/cp-metrics/0.log" Oct 07 13:39:34 crc kubenswrapper[4677]: I1007 13:39:34.436698 4677 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hq5q5_c6f0a46e-3591-4f18-9ff7-867b546b2273/cp-frr-files/0.log" Oct 07 13:39:34 crc kubenswrapper[4677]: I1007 13:39:34.443361 4677 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hq5q5_c6f0a46e-3591-4f18-9ff7-867b546b2273/cp-reloader/0.log" Oct 07 13:39:34 crc 
kubenswrapper[4677]: I1007 13:39:34.494792 4677 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hq5q5_c6f0a46e-3591-4f18-9ff7-867b546b2273/cp-metrics/0.log" Oct 07 13:39:34 crc kubenswrapper[4677]: I1007 13:39:34.658905 4677 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hq5q5_c6f0a46e-3591-4f18-9ff7-867b546b2273/cp-frr-files/0.log" Oct 07 13:39:34 crc kubenswrapper[4677]: I1007 13:39:34.663421 4677 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hq5q5_c6f0a46e-3591-4f18-9ff7-867b546b2273/controller/0.log" Oct 07 13:39:34 crc kubenswrapper[4677]: I1007 13:39:34.672661 4677 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hq5q5_c6f0a46e-3591-4f18-9ff7-867b546b2273/cp-metrics/0.log" Oct 07 13:39:34 crc kubenswrapper[4677]: I1007 13:39:34.694192 4677 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hq5q5_c6f0a46e-3591-4f18-9ff7-867b546b2273/cp-reloader/0.log" Oct 07 13:39:34 crc kubenswrapper[4677]: I1007 13:39:34.827559 4677 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hq5q5_c6f0a46e-3591-4f18-9ff7-867b546b2273/frr-metrics/0.log" Oct 07 13:39:34 crc kubenswrapper[4677]: I1007 13:39:34.853589 4677 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hq5q5_c6f0a46e-3591-4f18-9ff7-867b546b2273/kube-rbac-proxy/0.log" Oct 07 13:39:34 crc kubenswrapper[4677]: I1007 13:39:34.886598 4677 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hq5q5_c6f0a46e-3591-4f18-9ff7-867b546b2273/kube-rbac-proxy-frr/0.log" Oct 07 13:39:35 crc kubenswrapper[4677]: I1007 13:39:35.104722 4677 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-64bf5d555-g8px2_90595934-7ad8-4e4d-b918-2f3e63d63e34/frr-k8s-webhook-server/0.log" Oct 07 13:39:35 crc kubenswrapper[4677]: I1007 13:39:35.107356 4677 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hq5q5_c6f0a46e-3591-4f18-9ff7-867b546b2273/reloader/0.log" Oct 07 13:39:35 crc kubenswrapper[4677]: I1007 13:39:35.242951 4677 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hq5q5_c6f0a46e-3591-4f18-9ff7-867b546b2273/frr/0.log" Oct 07 13:39:35 crc kubenswrapper[4677]: I1007 13:39:35.296499 4677 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-859694fc4f-rbbdh_3c7d4a9a-028b-4131-8a8e-722259f8cd2c/manager/0.log" Oct 07 13:39:35 crc kubenswrapper[4677]: I1007 13:39:35.369214 4677 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-54787cf69c-lk2bh_407281a9-edd0-4bc9-bb53-dee866971f52/webhook-server/0.log" Oct 07 13:39:35 crc kubenswrapper[4677]: I1007 13:39:35.511427 4677 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-z6n69_6e40d031-a727-42d1-af91-4f20c3c10fef/kube-rbac-proxy/0.log" Oct 07 13:39:35 crc kubenswrapper[4677]: I1007 13:39:35.660938 4677 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-z6n69_6e40d031-a727-42d1-af91-4f20c3c10fef/speaker/0.log" Oct 07 13:39:43 crc kubenswrapper[4677]: I1007 13:39:43.303332 4677 scope.go:117] "RemoveContainer" containerID="9c395efdf36cc3142121da31bd43ed02b4d941c9bf27ad091d7892140334598b" Oct 07 13:39:43 crc kubenswrapper[4677]: E1007 13:39:43.304142 4677 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-r7cnz_openshift-machine-config-operator(7879fa59-a7cb-4d29-ba3a-c91f43bfcba6)\"" pod="openshift-machine-config-operator/machine-config-daemon-r7cnz" podUID="7879fa59-a7cb-4d29-ba3a-c91f43bfcba6" Oct 07 13:39:54 crc kubenswrapper[4677]: I1007 13:39:54.303644 4677 scope.go:117] "RemoveContainer" containerID="9c395efdf36cc3142121da31bd43ed02b4d941c9bf27ad091d7892140334598b" Oct 07 13:39:54 crc kubenswrapper[4677]: E1007 13:39:54.304888 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-r7cnz_openshift-machine-config-operator(7879fa59-a7cb-4d29-ba3a-c91f43bfcba6)\"" pod="openshift-machine-config-operator/machine-config-daemon-r7cnz" podUID="7879fa59-a7cb-4d29-ba3a-c91f43bfcba6" Oct 07 13:39:58 crc kubenswrapper[4677]: I1007 13:39:58.109477 4677 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2xjkqz_987fd203-f583-44fe-b845-d33510d6bb30/util/0.log" Oct 07 13:39:58 crc kubenswrapper[4677]: I1007 13:39:58.252315 4677 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2xjkqz_987fd203-f583-44fe-b845-d33510d6bb30/pull/0.log" Oct 07 13:39:58 crc kubenswrapper[4677]: I1007 13:39:58.258300 4677 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2xjkqz_987fd203-f583-44fe-b845-d33510d6bb30/util/0.log" Oct 07 13:39:58 crc kubenswrapper[4677]: I1007 13:39:58.264514 4677 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2xjkqz_987fd203-f583-44fe-b845-d33510d6bb30/pull/0.log" Oct 07 13:39:58 crc kubenswrapper[4677]: I1007 13:39:58.390113 4677 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2xjkqz_987fd203-f583-44fe-b845-d33510d6bb30/util/0.log" Oct 07 13:39:58 crc kubenswrapper[4677]: I1007 13:39:58.432827 4677 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2xjkqz_987fd203-f583-44fe-b845-d33510d6bb30/extract/0.log" Oct 07 13:39:58 crc kubenswrapper[4677]: I1007 13:39:58.439699 4677 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2xjkqz_987fd203-f583-44fe-b845-d33510d6bb30/pull/0.log" Oct 07 13:39:58 crc kubenswrapper[4677]: I1007 13:39:58.582756 4677 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-pjgk2_a97dd5c5-2e20-4155-8829-acd24af4fe9f/extract-utilities/0.log" Oct 07 13:39:58 crc kubenswrapper[4677]: I1007 13:39:58.728645 4677 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-pjgk2_a97dd5c5-2e20-4155-8829-acd24af4fe9f/extract-content/0.log" Oct 07 13:39:58 crc kubenswrapper[4677]: I1007 13:39:58.756823 4677 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_certified-operators-pjgk2_a97dd5c5-2e20-4155-8829-acd24af4fe9f/extract-utilities/0.log" Oct 07 13:39:58 crc kubenswrapper[4677]: I1007 13:39:58.757273 4677 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-pjgk2_a97dd5c5-2e20-4155-8829-acd24af4fe9f/extract-content/0.log" Oct 07 13:39:58 crc kubenswrapper[4677]: I1007 13:39:58.883151 4677 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-pjgk2_a97dd5c5-2e20-4155-8829-acd24af4fe9f/extract-utilities/0.log" Oct 07 13:39:58 crc kubenswrapper[4677]: I1007 13:39:58.907189 4677 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-pjgk2_a97dd5c5-2e20-4155-8829-acd24af4fe9f/extract-content/0.log" Oct 07 13:39:59 crc kubenswrapper[4677]: I1007 13:39:59.060800 4677 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-n428h_ad380640-a5ab-4fb9-838b-28b4732c597e/extract-utilities/0.log" Oct 07 13:39:59 crc kubenswrapper[4677]: I1007 13:39:59.165785 4677 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-pjgk2_a97dd5c5-2e20-4155-8829-acd24af4fe9f/registry-server/0.log" Oct 07 13:39:59 crc kubenswrapper[4677]: I1007 13:39:59.243131 4677 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-n428h_ad380640-a5ab-4fb9-838b-28b4732c597e/extract-content/0.log" Oct 07 13:39:59 crc kubenswrapper[4677]: I1007 13:39:59.256642 4677 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-n428h_ad380640-a5ab-4fb9-838b-28b4732c597e/extract-content/0.log" Oct 07 13:39:59 crc kubenswrapper[4677]: I1007 13:39:59.292678 4677 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-n428h_ad380640-a5ab-4fb9-838b-28b4732c597e/extract-utilities/0.log" Oct 07 13:39:59 crc kubenswrapper[4677]: I1007 13:39:59.417317 4677 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-n428h_ad380640-a5ab-4fb9-838b-28b4732c597e/extract-utilities/0.log" Oct 07 13:39:59 crc kubenswrapper[4677]: I1007 13:39:59.453937 4677 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-n428h_ad380640-a5ab-4fb9-838b-28b4732c597e/extract-content/0.log" Oct 07 13:39:59 crc kubenswrapper[4677]: I1007 13:39:59.609277 4677 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-zbppr_c8872e7f-608d-4ade-8466-e1e743417ece/marketplace-operator/0.log" Oct 07 13:39:59 crc kubenswrapper[4677]: I1007 13:39:59.700755 4677 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-r7528_422fbc9b-de01-4f7f-8be7-0036569e6dbb/extract-utilities/0.log" Oct 07 13:39:59 crc kubenswrapper[4677]: I1007 13:39:59.738056 4677 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-n428h_ad380640-a5ab-4fb9-838b-28b4732c597e/registry-server/0.log" Oct 07 13:39:59 crc kubenswrapper[4677]: I1007 13:39:59.883585 4677 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-r7528_422fbc9b-de01-4f7f-8be7-0036569e6dbb/extract-content/0.log" Oct 07 13:39:59 crc kubenswrapper[4677]: I1007 13:39:59.889621 4677 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-r7528_422fbc9b-de01-4f7f-8be7-0036569e6dbb/extract-utilities/0.log" Oct 07 13:39:59 crc kubenswrapper[4677]: I1007 13:39:59.916206 4677 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-r7528_422fbc9b-de01-4f7f-8be7-0036569e6dbb/extract-content/0.log" Oct 07 13:40:00 crc kubenswrapper[4677]: I1007 13:40:00.013703 4677 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-r7528_422fbc9b-de01-4f7f-8be7-0036569e6dbb/extract-utilities/0.log" Oct 07 13:40:00 crc kubenswrapper[4677]: I1007 13:40:00.074020 4677 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-r7528_422fbc9b-de01-4f7f-8be7-0036569e6dbb/extract-content/0.log" Oct 07 13:40:00 crc kubenswrapper[4677]: I1007 13:40:00.114823 4677 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-r7528_422fbc9b-de01-4f7f-8be7-0036569e6dbb/registry-server/0.log" Oct 07 13:40:00 crc kubenswrapper[4677]: I1007 13:40:00.214662 4677 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-pm2jz_e628b85d-09f9-4559-9853-b35c19a0e0e6/extract-utilities/0.log" Oct 07 13:40:00 crc kubenswrapper[4677]: I1007 13:40:00.365845 4677 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-pm2jz_e628b85d-09f9-4559-9853-b35c19a0e0e6/extract-utilities/0.log" Oct 07 13:40:00 crc kubenswrapper[4677]: I1007 13:40:00.375546 4677 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-pm2jz_e628b85d-09f9-4559-9853-b35c19a0e0e6/extract-content/0.log" Oct 07 13:40:00 crc kubenswrapper[4677]: I1007 13:40:00.380626 4677 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-pm2jz_e628b85d-09f9-4559-9853-b35c19a0e0e6/extract-content/0.log" Oct 07 13:40:00 crc kubenswrapper[4677]: I1007 13:40:00.511991 4677 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-pm2jz_e628b85d-09f9-4559-9853-b35c19a0e0e6/extract-content/0.log" Oct 07 13:40:00 crc kubenswrapper[4677]: I1007 13:40:00.558479 4677 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-pm2jz_e628b85d-09f9-4559-9853-b35c19a0e0e6/extract-utilities/0.log" Oct 07 13:40:00 crc kubenswrapper[4677]: I1007 13:40:00.943073 4677 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-pm2jz_e628b85d-09f9-4559-9853-b35c19a0e0e6/registry-server/0.log" Oct 07 13:40:07 crc kubenswrapper[4677]: I1007 13:40:07.302970 4677 scope.go:117] "RemoveContainer" containerID="9c395efdf36cc3142121da31bd43ed02b4d941c9bf27ad091d7892140334598b" Oct 07 13:40:07 crc kubenswrapper[4677]: E1007 13:40:07.303665 4677 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-r7cnz_openshift-machine-config-operator(7879fa59-a7cb-4d29-ba3a-c91f43bfcba6)\"" pod="openshift-machine-config-operator/machine-config-daemon-r7cnz" podUID="7879fa59-a7cb-4d29-ba3a-c91f43bfcba6" Oct 07 13:40:20 crc kubenswrapper[4677]: I1007 13:40:20.303560 4677 scope.go:117] "RemoveContainer" containerID="9c395efdf36cc3142121da31bd43ed02b4d941c9bf27ad091d7892140334598b" Oct 07 
13:40:20 crc kubenswrapper[4677]: I1007 13:40:20.725047 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-r7cnz" event={"ID":"7879fa59-a7cb-4d29-ba3a-c91f43bfcba6","Type":"ContainerStarted","Data":"a3aa1af230f942f967cc0d97803787614843e98cff81e85c6c158a5f9d90198f"} Oct 07 13:40:59 crc kubenswrapper[4677]: I1007 13:40:59.964348 4677 generic.go:334] "Generic (PLEG): container finished" podID="0d68b9d9-1520-4f23-9690-eb30cd87be84" containerID="89496c967f5373ddebe75e014ab13173a725b195b537372ac4dc7675ab1e6906" exitCode=0 Oct 07 13:40:59 crc kubenswrapper[4677]: I1007 13:40:59.964481 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-dtsxl/must-gather-nj4nl" event={"ID":"0d68b9d9-1520-4f23-9690-eb30cd87be84","Type":"ContainerDied","Data":"89496c967f5373ddebe75e014ab13173a725b195b537372ac4dc7675ab1e6906"} Oct 07 13:40:59 crc kubenswrapper[4677]: I1007 13:40:59.965371 4677 scope.go:117] "RemoveContainer" containerID="89496c967f5373ddebe75e014ab13173a725b195b537372ac4dc7675ab1e6906" Oct 07 13:41:00 crc kubenswrapper[4677]: I1007 13:41:00.316406 4677 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-dtsxl_must-gather-nj4nl_0d68b9d9-1520-4f23-9690-eb30cd87be84/gather/0.log" Oct 07 13:41:07 crc kubenswrapper[4677]: I1007 13:41:07.042536 4677 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-dtsxl/must-gather-nj4nl"] Oct 07 13:41:07 crc kubenswrapper[4677]: I1007 13:41:07.043219 4677 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-dtsxl/must-gather-nj4nl" podUID="0d68b9d9-1520-4f23-9690-eb30cd87be84" containerName="copy" containerID="cri-o://c1ad74c7d16fc11086bc976b6111028528c18543bef9806f8d06815fc00445f0" gracePeriod=2 Oct 07 13:41:07 crc kubenswrapper[4677]: I1007 13:41:07.047869 4677 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-dtsxl/must-gather-nj4nl"] Oct 07 13:41:07 crc kubenswrapper[4677]: I1007 13:41:07.565587 4677 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-dtsxl_must-gather-nj4nl_0d68b9d9-1520-4f23-9690-eb30cd87be84/copy/0.log" Oct 07 13:41:07 crc kubenswrapper[4677]: I1007 13:41:07.565926 4677 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-dtsxl/must-gather-nj4nl" Oct 07 13:41:07 crc kubenswrapper[4677]: I1007 13:41:07.588624 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/0d68b9d9-1520-4f23-9690-eb30cd87be84-must-gather-output\") pod \"0d68b9d9-1520-4f23-9690-eb30cd87be84\" (UID: \"0d68b9d9-1520-4f23-9690-eb30cd87be84\") " Oct 07 13:41:07 crc kubenswrapper[4677]: I1007 13:41:07.588940 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6vwn2\" (UniqueName: \"kubernetes.io/projected/0d68b9d9-1520-4f23-9690-eb30cd87be84-kube-api-access-6vwn2\") pod \"0d68b9d9-1520-4f23-9690-eb30cd87be84\" (UID: \"0d68b9d9-1520-4f23-9690-eb30cd87be84\") " Oct 07 13:41:07 crc kubenswrapper[4677]: I1007 13:41:07.594894 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0d68b9d9-1520-4f23-9690-eb30cd87be84-kube-api-access-6vwn2" (OuterVolumeSpecName: "kube-api-access-6vwn2") pod "0d68b9d9-1520-4f23-9690-eb30cd87be84" (UID: "0d68b9d9-1520-4f23-9690-eb30cd87be84"). 
InnerVolumeSpecName "kube-api-access-6vwn2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:41:07 crc kubenswrapper[4677]: I1007 13:41:07.639121 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0d68b9d9-1520-4f23-9690-eb30cd87be84-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "0d68b9d9-1520-4f23-9690-eb30cd87be84" (UID: "0d68b9d9-1520-4f23-9690-eb30cd87be84"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:41:07 crc kubenswrapper[4677]: I1007 13:41:07.690838 4677 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/0d68b9d9-1520-4f23-9690-eb30cd87be84-must-gather-output\") on node \"crc\" DevicePath \"\"" Oct 07 13:41:07 crc kubenswrapper[4677]: I1007 13:41:07.690894 4677 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6vwn2\" (UniqueName: \"kubernetes.io/projected/0d68b9d9-1520-4f23-9690-eb30cd87be84-kube-api-access-6vwn2\") on node \"crc\" DevicePath \"\"" Oct 07 13:41:08 crc kubenswrapper[4677]: I1007 13:41:08.025963 4677 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-dtsxl_must-gather-nj4nl_0d68b9d9-1520-4f23-9690-eb30cd87be84/copy/0.log" Oct 07 13:41:08 crc kubenswrapper[4677]: I1007 13:41:08.026474 4677 generic.go:334] "Generic (PLEG): container finished" podID="0d68b9d9-1520-4f23-9690-eb30cd87be84" containerID="c1ad74c7d16fc11086bc976b6111028528c18543bef9806f8d06815fc00445f0" exitCode=143 Oct 07 13:41:08 crc kubenswrapper[4677]: I1007 13:41:08.026537 4677 scope.go:117] "RemoveContainer" containerID="c1ad74c7d16fc11086bc976b6111028528c18543bef9806f8d06815fc00445f0" Oct 07 13:41:08 crc kubenswrapper[4677]: I1007 13:41:08.026547 4677 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-dtsxl/must-gather-nj4nl" Oct 07 13:41:08 crc kubenswrapper[4677]: I1007 13:41:08.048258 4677 scope.go:117] "RemoveContainer" containerID="89496c967f5373ddebe75e014ab13173a725b195b537372ac4dc7675ab1e6906" Oct 07 13:41:08 crc kubenswrapper[4677]: I1007 13:41:08.099320 4677 scope.go:117] "RemoveContainer" containerID="c1ad74c7d16fc11086bc976b6111028528c18543bef9806f8d06815fc00445f0" Oct 07 13:41:08 crc kubenswrapper[4677]: E1007 13:41:08.099877 4677 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c1ad74c7d16fc11086bc976b6111028528c18543bef9806f8d06815fc00445f0\": container with ID starting with c1ad74c7d16fc11086bc976b6111028528c18543bef9806f8d06815fc00445f0 not found: ID does not exist" containerID="c1ad74c7d16fc11086bc976b6111028528c18543bef9806f8d06815fc00445f0" Oct 07 13:41:08 crc kubenswrapper[4677]: I1007 13:41:08.099931 4677 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c1ad74c7d16fc11086bc976b6111028528c18543bef9806f8d06815fc00445f0"} err="failed to get container status \"c1ad74c7d16fc11086bc976b6111028528c18543bef9806f8d06815fc00445f0\": rpc error: code = NotFound desc = could not find container \"c1ad74c7d16fc11086bc976b6111028528c18543bef9806f8d06815fc00445f0\": container with ID starting with c1ad74c7d16fc11086bc976b6111028528c18543bef9806f8d06815fc00445f0 not found: ID does not exist" Oct 07 13:41:08 crc kubenswrapper[4677]: I1007 13:41:08.099967 4677 scope.go:117] "RemoveContainer" containerID="89496c967f5373ddebe75e014ab13173a725b195b537372ac4dc7675ab1e6906" Oct 07 13:41:08 crc kubenswrapper[4677]: E1007 13:41:08.100350 4677 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"89496c967f5373ddebe75e014ab13173a725b195b537372ac4dc7675ab1e6906\": container with ID starting with 89496c967f5373ddebe75e014ab13173a725b195b537372ac4dc7675ab1e6906 not found: ID does not exist" containerID="89496c967f5373ddebe75e014ab13173a725b195b537372ac4dc7675ab1e6906" Oct 07 13:41:08 crc kubenswrapper[4677]: I1007 13:41:08.100406 4677 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"89496c967f5373ddebe75e014ab13173a725b195b537372ac4dc7675ab1e6906"} err="failed to get container status \"89496c967f5373ddebe75e014ab13173a725b195b537372ac4dc7675ab1e6906\": rpc error: code = NotFound desc = could not find container \"89496c967f5373ddebe75e014ab13173a725b195b537372ac4dc7675ab1e6906\": container with ID starting with 89496c967f5373ddebe75e014ab13173a725b195b537372ac4dc7675ab1e6906 not found: ID does not exist" Oct 07 13:41:09 crc kubenswrapper[4677]: I1007 13:41:09.325781 4677 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0d68b9d9-1520-4f23-9690-eb30cd87be84" path="/var/lib/kubelet/pods/0d68b9d9-1520-4f23-9690-eb30cd87be84/volumes" Oct 07 13:41:49 crc kubenswrapper[4677]: I1007 13:41:49.595591 4677 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-wr287/must-gather-4j4vh"] Oct 07 13:41:49 crc kubenswrapper[4677]: E1007 13:41:49.596396 4677 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d68b9d9-1520-4f23-9690-eb30cd87be84" containerName="gather" Oct 07 13:41:49 crc kubenswrapper[4677]: I1007 13:41:49.596411 4677 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d68b9d9-1520-4f23-9690-eb30cd87be84" containerName="gather" Oct 07 13:41:49 crc 
kubenswrapper[4677]: E1007 13:41:49.596455 4677 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d68b9d9-1520-4f23-9690-eb30cd87be84" containerName="copy" Oct 07 13:41:49 crc kubenswrapper[4677]: I1007 13:41:49.596465 4677 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d68b9d9-1520-4f23-9690-eb30cd87be84" containerName="copy" Oct 07 13:41:49 crc kubenswrapper[4677]: I1007 13:41:49.596614 4677 memory_manager.go:354] "RemoveStaleState removing state" podUID="0d68b9d9-1520-4f23-9690-eb30cd87be84" containerName="gather" Oct 07 13:41:49 crc kubenswrapper[4677]: I1007 13:41:49.596630 4677 memory_manager.go:354] "RemoveStaleState removing state" podUID="0d68b9d9-1520-4f23-9690-eb30cd87be84" containerName="copy" Oct 07 13:41:49 crc kubenswrapper[4677]: I1007 13:41:49.597367 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-wr287/must-gather-4j4vh" Oct 07 13:41:49 crc kubenswrapper[4677]: I1007 13:41:49.605155 4677 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-wr287"/"kube-root-ca.crt" Oct 07 13:41:49 crc kubenswrapper[4677]: I1007 13:41:49.605717 4677 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-wr287"/"openshift-service-ca.crt" Oct 07 13:41:49 crc kubenswrapper[4677]: I1007 13:41:49.622127 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-wr287/must-gather-4j4vh"] Oct 07 13:41:49 crc kubenswrapper[4677]: I1007 13:41:49.638665 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tzmxv\" (UniqueName: \"kubernetes.io/projected/9703992d-5370-4c26-9afa-0e22239ab5e3-kube-api-access-tzmxv\") pod \"must-gather-4j4vh\" (UID: \"9703992d-5370-4c26-9afa-0e22239ab5e3\") " pod="openshift-must-gather-wr287/must-gather-4j4vh" Oct 07 13:41:49 crc kubenswrapper[4677]: I1007 13:41:49.638731 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/9703992d-5370-4c26-9afa-0e22239ab5e3-must-gather-output\") pod \"must-gather-4j4vh\" (UID: \"9703992d-5370-4c26-9afa-0e22239ab5e3\") " pod="openshift-must-gather-wr287/must-gather-4j4vh" Oct 07 13:41:49 crc kubenswrapper[4677]: I1007 13:41:49.739462 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tzmxv\" (UniqueName: \"kubernetes.io/projected/9703992d-5370-4c26-9afa-0e22239ab5e3-kube-api-access-tzmxv\") pod \"must-gather-4j4vh\" (UID: \"9703992d-5370-4c26-9afa-0e22239ab5e3\") " pod="openshift-must-gather-wr287/must-gather-4j4vh" Oct 07 13:41:49 crc kubenswrapper[4677]: I1007 13:41:49.739510 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/9703992d-5370-4c26-9afa-0e22239ab5e3-must-gather-output\") pod \"must-gather-4j4vh\" (UID: \"9703992d-5370-4c26-9afa-0e22239ab5e3\") " pod="openshift-must-gather-wr287/must-gather-4j4vh" Oct 07 13:41:49 crc kubenswrapper[4677]: I1007 13:41:49.739935 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/9703992d-5370-4c26-9afa-0e22239ab5e3-must-gather-output\") pod \"must-gather-4j4vh\" (UID: \"9703992d-5370-4c26-9afa-0e22239ab5e3\") " pod="openshift-must-gather-wr287/must-gather-4j4vh" Oct 07 13:41:49 crc kubenswrapper[4677]: I1007 13:41:49.765306 4677 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tzmxv\" (UniqueName: \"kubernetes.io/projected/9703992d-5370-4c26-9afa-0e22239ab5e3-kube-api-access-tzmxv\") pod \"must-gather-4j4vh\" (UID: \"9703992d-5370-4c26-9afa-0e22239ab5e3\") " pod="openshift-must-gather-wr287/must-gather-4j4vh" Oct 07 13:41:49 crc kubenswrapper[4677]: I1007 13:41:49.918302 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-wr287/must-gather-4j4vh" Oct 07 13:41:50 crc kubenswrapper[4677]: I1007 13:41:50.333548 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-wr287/must-gather-4j4vh"] Oct 07 13:41:51 crc kubenswrapper[4677]: I1007 13:41:51.285861 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wr287/must-gather-4j4vh" event={"ID":"9703992d-5370-4c26-9afa-0e22239ab5e3","Type":"ContainerStarted","Data":"ea94d9bbc90ff9807410889556f24946bf8f7a130d9711ad16087d9c6ac74399"} Oct 07 13:41:51 crc kubenswrapper[4677]: I1007 13:41:51.286179 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wr287/must-gather-4j4vh" event={"ID":"9703992d-5370-4c26-9afa-0e22239ab5e3","Type":"ContainerStarted","Data":"fd80b5cbc6950c5f215066aee5aba45e090645834826a405747a70d19859890a"} Oct 07 13:41:51 crc kubenswrapper[4677]: I1007 13:41:51.286194 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wr287/must-gather-4j4vh" event={"ID":"9703992d-5370-4c26-9afa-0e22239ab5e3","Type":"ContainerStarted","Data":"34d8371939e2c4b4df43e7052fd09d47f3c2ca36c2d70406b74c24b248532e7b"} Oct 07 13:41:51 crc kubenswrapper[4677]: I1007 13:41:51.299262 4677 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-wr287/must-gather-4j4vh" podStartSLOduration=2.299240674 podStartE2EDuration="2.299240674s" podCreationTimestamp="2025-10-07 13:41:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:41:51.298184593 +0000 UTC m=+2082.783893728" watchObservedRunningTime="2025-10-07 13:41:51.299240674 +0000 UTC m=+2082.784949789" Oct 07 13:41:58 crc kubenswrapper[4677]: I1007 13:41:58.042195 4677 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-z2scz"] Oct 07 13:41:58 crc kubenswrapper[4677]: I1007 13:41:58.043997 4677 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-z2scz" Oct 07 13:41:58 crc kubenswrapper[4677]: I1007 13:41:58.055099 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-z2scz"] Oct 07 13:41:58 crc kubenswrapper[4677]: I1007 13:41:58.240106 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/37e4e5a2-3646-4573-a3cf-0212f31dc9d4-catalog-content\") pod \"redhat-operators-z2scz\" (UID: \"37e4e5a2-3646-4573-a3cf-0212f31dc9d4\") " pod="openshift-marketplace/redhat-operators-z2scz" Oct 07 13:41:58 crc kubenswrapper[4677]: I1007 13:41:58.240166 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kcfms\" (UniqueName: \"kubernetes.io/projected/37e4e5a2-3646-4573-a3cf-0212f31dc9d4-kube-api-access-kcfms\") pod \"redhat-operators-z2scz\" (UID: \"37e4e5a2-3646-4573-a3cf-0212f31dc9d4\") " pod="openshift-marketplace/redhat-operators-z2scz" Oct 07 13:41:58 crc kubenswrapper[4677]: I1007 13:41:58.240196 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/37e4e5a2-3646-4573-a3cf-0212f31dc9d4-utilities\") pod \"redhat-operators-z2scz\" (UID: \"37e4e5a2-3646-4573-a3cf-0212f31dc9d4\") " pod="openshift-marketplace/redhat-operators-z2scz" Oct 07 13:41:58 crc kubenswrapper[4677]: I1007 13:41:58.341680 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/37e4e5a2-3646-4573-a3cf-0212f31dc9d4-catalog-content\") pod \"redhat-operators-z2scz\" (UID: \"37e4e5a2-3646-4573-a3cf-0212f31dc9d4\") " pod="openshift-marketplace/redhat-operators-z2scz" Oct 07 13:41:58 crc kubenswrapper[4677]: I1007 13:41:58.341747 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kcfms\" (UniqueName: \"kubernetes.io/projected/37e4e5a2-3646-4573-a3cf-0212f31dc9d4-kube-api-access-kcfms\") pod \"redhat-operators-z2scz\" (UID: \"37e4e5a2-3646-4573-a3cf-0212f31dc9d4\") " pod="openshift-marketplace/redhat-operators-z2scz" Oct 07 13:41:58 crc kubenswrapper[4677]: I1007 13:41:58.341788 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/37e4e5a2-3646-4573-a3cf-0212f31dc9d4-utilities\") pod \"redhat-operators-z2scz\" (UID: \"37e4e5a2-3646-4573-a3cf-0212f31dc9d4\") " pod="openshift-marketplace/redhat-operators-z2scz" Oct 07 13:41:58 crc kubenswrapper[4677]: I1007 13:41:58.342193 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/37e4e5a2-3646-4573-a3cf-0212f31dc9d4-catalog-content\") pod \"redhat-operators-z2scz\" (UID: \"37e4e5a2-3646-4573-a3cf-0212f31dc9d4\") " pod="openshift-marketplace/redhat-operators-z2scz" Oct 07 13:41:58 crc kubenswrapper[4677]: I1007 13:41:58.342276 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/37e4e5a2-3646-4573-a3cf-0212f31dc9d4-utilities\") pod \"redhat-operators-z2scz\" (UID: \"37e4e5a2-3646-4573-a3cf-0212f31dc9d4\") " pod="openshift-marketplace/redhat-operators-z2scz" Oct 07 13:41:58 crc kubenswrapper[4677]: I1007 13:41:58.362468 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-kcfms\" (UniqueName: \"kubernetes.io/projected/37e4e5a2-3646-4573-a3cf-0212f31dc9d4-kube-api-access-kcfms\") pod \"redhat-operators-z2scz\" (UID: \"37e4e5a2-3646-4573-a3cf-0212f31dc9d4\") " pod="openshift-marketplace/redhat-operators-z2scz" Oct 07 13:41:58 crc kubenswrapper[4677]: I1007 13:41:58.363271 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-z2scz" Oct 07 13:41:58 crc kubenswrapper[4677]: I1007 13:41:58.571763 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-z2scz"] Oct 07 13:41:59 crc kubenswrapper[4677]: I1007 13:41:59.326894 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z2scz" event={"ID":"37e4e5a2-3646-4573-a3cf-0212f31dc9d4","Type":"ContainerStarted","Data":"b7250bf47072398678f7c9033b72b5312ef1f9af1e275b4322654875c0918b9c"} Oct 07 13:42:00 crc kubenswrapper[4677]: I1007 13:42:00.242809 4677 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-gp9b6"] Oct 07 13:42:00 crc kubenswrapper[4677]: I1007 13:42:00.244069 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gp9b6" Oct 07 13:42:00 crc kubenswrapper[4677]: I1007 13:42:00.257779 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-gp9b6"] Oct 07 13:42:00 crc kubenswrapper[4677]: I1007 13:42:00.334043 4677 generic.go:334] "Generic (PLEG): container finished" podID="37e4e5a2-3646-4573-a3cf-0212f31dc9d4" containerID="bf7bebfc45e1ce944dbf6422728475a297669e7ff016c3916975f42ffa7bca1d" exitCode=0 Oct 07 13:42:00 crc kubenswrapper[4677]: I1007 13:42:00.334085 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z2scz" event={"ID":"37e4e5a2-3646-4573-a3cf-0212f31dc9d4","Type":"ContainerDied","Data":"bf7bebfc45e1ce944dbf6422728475a297669e7ff016c3916975f42ffa7bca1d"} Oct 07 13:42:00 crc kubenswrapper[4677]: I1007 13:42:00.335757 4677 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 07 13:42:00 crc kubenswrapper[4677]: I1007 13:42:00.367114 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fad3b0e3-5c0f-43f8-98ec-cae1e66b1cea-utilities\") pod \"certified-operators-gp9b6\" (UID: \"fad3b0e3-5c0f-43f8-98ec-cae1e66b1cea\") " pod="openshift-marketplace/certified-operators-gp9b6" Oct 07 13:42:00 crc kubenswrapper[4677]: I1007 13:42:00.367292 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fad3b0e3-5c0f-43f8-98ec-cae1e66b1cea-catalog-content\") pod \"certified-operators-gp9b6\" (UID: \"fad3b0e3-5c0f-43f8-98ec-cae1e66b1cea\") " pod="openshift-marketplace/certified-operators-gp9b6" Oct 07 13:42:00 crc kubenswrapper[4677]: I1007 13:42:00.367333 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v6zwd\" (UniqueName: \"kubernetes.io/projected/fad3b0e3-5c0f-43f8-98ec-cae1e66b1cea-kube-api-access-v6zwd\") pod \"certified-operators-gp9b6\" (UID: \"fad3b0e3-5c0f-43f8-98ec-cae1e66b1cea\") " pod="openshift-marketplace/certified-operators-gp9b6" Oct 07 13:42:00 crc kubenswrapper[4677]: I1007 13:42:00.445662 4677 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-j4n7c"] Oct 07 13:42:00 crc kubenswrapper[4677]: I1007 13:42:00.447014 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-j4n7c" Oct 07 13:42:00 crc kubenswrapper[4677]: I1007 13:42:00.462041 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-j4n7c"] Oct 07 13:42:00 crc kubenswrapper[4677]: I1007 13:42:00.468068 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v6zwd\" (UniqueName: \"kubernetes.io/projected/fad3b0e3-5c0f-43f8-98ec-cae1e66b1cea-kube-api-access-v6zwd\") pod \"certified-operators-gp9b6\" (UID: \"fad3b0e3-5c0f-43f8-98ec-cae1e66b1cea\") " pod="openshift-marketplace/certified-operators-gp9b6" Oct 07 13:42:00 crc kubenswrapper[4677]: I1007 13:42:00.468144 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fad3b0e3-5c0f-43f8-98ec-cae1e66b1cea-utilities\") pod \"certified-operators-gp9b6\" (UID: \"fad3b0e3-5c0f-43f8-98ec-cae1e66b1cea\") " pod="openshift-marketplace/certified-operators-gp9b6" Oct 07 13:42:00 crc kubenswrapper[4677]: I1007 13:42:00.468509 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fad3b0e3-5c0f-43f8-98ec-cae1e66b1cea-catalog-content\") pod \"certified-operators-gp9b6\" (UID: \"fad3b0e3-5c0f-43f8-98ec-cae1e66b1cea\") " pod="openshift-marketplace/certified-operators-gp9b6" Oct 07 13:42:00 crc kubenswrapper[4677]: I1007 13:42:00.468813 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fad3b0e3-5c0f-43f8-98ec-cae1e66b1cea-utilities\") pod \"certified-operators-gp9b6\" (UID: \"fad3b0e3-5c0f-43f8-98ec-cae1e66b1cea\") " pod="openshift-marketplace/certified-operators-gp9b6" Oct 07 13:42:00 crc kubenswrapper[4677]: I1007 13:42:00.468825 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fad3b0e3-5c0f-43f8-98ec-cae1e66b1cea-catalog-content\") pod \"certified-operators-gp9b6\" (UID: \"fad3b0e3-5c0f-43f8-98ec-cae1e66b1cea\") " pod="openshift-marketplace/certified-operators-gp9b6" Oct 07 13:42:00 crc kubenswrapper[4677]: I1007 13:42:00.486716 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v6zwd\" (UniqueName: \"kubernetes.io/projected/fad3b0e3-5c0f-43f8-98ec-cae1e66b1cea-kube-api-access-v6zwd\") pod \"certified-operators-gp9b6\" (UID: \"fad3b0e3-5c0f-43f8-98ec-cae1e66b1cea\") " pod="openshift-marketplace/certified-operators-gp9b6" Oct 07 13:42:00 crc kubenswrapper[4677]: I1007 13:42:00.558797 4677 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-gp9b6" Oct 07 13:42:00 crc kubenswrapper[4677]: I1007 13:42:00.569931 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/03f37fe1-d7e4-4f42-b103-aa783f240b65-utilities\") pod \"community-operators-j4n7c\" (UID: \"03f37fe1-d7e4-4f42-b103-aa783f240b65\") " pod="openshift-marketplace/community-operators-j4n7c" Oct 07 13:42:00 crc kubenswrapper[4677]: I1007 13:42:00.570165 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/03f37fe1-d7e4-4f42-b103-aa783f240b65-catalog-content\") pod \"community-operators-j4n7c\" (UID: \"03f37fe1-d7e4-4f42-b103-aa783f240b65\") " pod="openshift-marketplace/community-operators-j4n7c" Oct 07 13:42:00 crc kubenswrapper[4677]: I1007 13:42:00.570287 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c57nz\" (UniqueName: \"kubernetes.io/projected/03f37fe1-d7e4-4f42-b103-aa783f240b65-kube-api-access-c57nz\") pod \"community-operators-j4n7c\" (UID: \"03f37fe1-d7e4-4f42-b103-aa783f240b65\") " pod="openshift-marketplace/community-operators-j4n7c" Oct 07 13:42:00 crc kubenswrapper[4677]: I1007 13:42:00.672084 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/03f37fe1-d7e4-4f42-b103-aa783f240b65-utilities\") pod \"community-operators-j4n7c\" (UID: \"03f37fe1-d7e4-4f42-b103-aa783f240b65\") " pod="openshift-marketplace/community-operators-j4n7c" Oct 07 13:42:00 crc kubenswrapper[4677]: I1007 13:42:00.672133 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/03f37fe1-d7e4-4f42-b103-aa783f240b65-catalog-content\") pod \"community-operators-j4n7c\" (UID: \"03f37fe1-d7e4-4f42-b103-aa783f240b65\") " pod="openshift-marketplace/community-operators-j4n7c" Oct 07 13:42:00 crc kubenswrapper[4677]: I1007 13:42:00.672215 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c57nz\" (UniqueName: \"kubernetes.io/projected/03f37fe1-d7e4-4f42-b103-aa783f240b65-kube-api-access-c57nz\") pod \"community-operators-j4n7c\" (UID: \"03f37fe1-d7e4-4f42-b103-aa783f240b65\") " pod="openshift-marketplace/community-operators-j4n7c" Oct 07 13:42:00 crc kubenswrapper[4677]: I1007 13:42:00.672533 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/03f37fe1-d7e4-4f42-b103-aa783f240b65-utilities\") pod \"community-operators-j4n7c\" (UID: \"03f37fe1-d7e4-4f42-b103-aa783f240b65\") " pod="openshift-marketplace/community-operators-j4n7c" Oct 07 13:42:00 crc kubenswrapper[4677]: I1007 13:42:00.672808 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/03f37fe1-d7e4-4f42-b103-aa783f240b65-catalog-content\") pod \"community-operators-j4n7c\" (UID: \"03f37fe1-d7e4-4f42-b103-aa783f240b65\") " pod="openshift-marketplace/community-operators-j4n7c" Oct 07 13:42:00 crc kubenswrapper[4677]: I1007 13:42:00.700324 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c57nz\" (UniqueName: \"kubernetes.io/projected/03f37fe1-d7e4-4f42-b103-aa783f240b65-kube-api-access-c57nz\") pod 
\"community-operators-j4n7c\" (UID: \"03f37fe1-d7e4-4f42-b103-aa783f240b65\") " pod="openshift-marketplace/community-operators-j4n7c" Oct 07 13:42:00 crc kubenswrapper[4677]: I1007 13:42:00.761053 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-j4n7c" Oct 07 13:42:01 crc kubenswrapper[4677]: I1007 13:42:01.004521 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-j4n7c"] Oct 07 13:42:01 crc kubenswrapper[4677]: I1007 13:42:01.046629 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-gp9b6"] Oct 07 13:42:01 crc kubenswrapper[4677]: W1007 13:42:01.067013 4677 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfad3b0e3_5c0f_43f8_98ec_cae1e66b1cea.slice/crio-b44773d26b01c1f81743665682461a17796e090c13e9f1903cd42a05f6322b58 WatchSource:0}: Error finding container b44773d26b01c1f81743665682461a17796e090c13e9f1903cd42a05f6322b58: Status 404 returned error can't find the container with id b44773d26b01c1f81743665682461a17796e090c13e9f1903cd42a05f6322b58 Oct 07 13:42:01 crc kubenswrapper[4677]: I1007 13:42:01.341936 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j4n7c" event={"ID":"03f37fe1-d7e4-4f42-b103-aa783f240b65","Type":"ContainerStarted","Data":"9de82f601ca031f638715783378334690c954540a7f99b1b0ff4289f3ccf9e61"} Oct 07 13:42:01 crc kubenswrapper[4677]: I1007 13:42:01.342014 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j4n7c" event={"ID":"03f37fe1-d7e4-4f42-b103-aa783f240b65","Type":"ContainerStarted","Data":"50f9e8545676e7193e1f1693b8f6e06ef31cbc5060984c8f79e7db7864988a75"} Oct 07 13:42:01 crc kubenswrapper[4677]: I1007 13:42:01.343188 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gp9b6" event={"ID":"fad3b0e3-5c0f-43f8-98ec-cae1e66b1cea","Type":"ContainerStarted","Data":"b44773d26b01c1f81743665682461a17796e090c13e9f1903cd42a05f6322b58"} Oct 07 13:42:02 crc kubenswrapper[4677]: I1007 13:42:02.353927 4677 generic.go:334] "Generic (PLEG): container finished" podID="03f37fe1-d7e4-4f42-b103-aa783f240b65" containerID="9de82f601ca031f638715783378334690c954540a7f99b1b0ff4289f3ccf9e61" exitCode=0 Oct 07 13:42:02 crc kubenswrapper[4677]: I1007 13:42:02.354013 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j4n7c" event={"ID":"03f37fe1-d7e4-4f42-b103-aa783f240b65","Type":"ContainerDied","Data":"9de82f601ca031f638715783378334690c954540a7f99b1b0ff4289f3ccf9e61"} Oct 07 13:42:02 crc kubenswrapper[4677]: I1007 13:42:02.356593 4677 generic.go:334] "Generic (PLEG): container finished" podID="fad3b0e3-5c0f-43f8-98ec-cae1e66b1cea" containerID="272185fdfc16ebea4a479c72f95eb29d1c7a3634a0a8947473bb98f91bdae90f" exitCode=0 Oct 07 13:42:02 crc kubenswrapper[4677]: I1007 13:42:02.356694 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gp9b6" event={"ID":"fad3b0e3-5c0f-43f8-98ec-cae1e66b1cea","Type":"ContainerDied","Data":"272185fdfc16ebea4a479c72f95eb29d1c7a3634a0a8947473bb98f91bdae90f"} Oct 07 13:42:03 crc kubenswrapper[4677]: I1007 13:42:03.365467 4677 generic.go:334] "Generic (PLEG): container finished" podID="37e4e5a2-3646-4573-a3cf-0212f31dc9d4" 
containerID="6effb6df005653d2c34d573684ddbb56c31fb7b51b3e3e2a98ac8cdd81e82ee7" exitCode=0 Oct 07 13:42:03 crc kubenswrapper[4677]: I1007 13:42:03.365510 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z2scz" event={"ID":"37e4e5a2-3646-4573-a3cf-0212f31dc9d4","Type":"ContainerDied","Data":"6effb6df005653d2c34d573684ddbb56c31fb7b51b3e3e2a98ac8cdd81e82ee7"} Oct 07 13:42:04 crc kubenswrapper[4677]: I1007 13:42:04.373959 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j4n7c" event={"ID":"03f37fe1-d7e4-4f42-b103-aa783f240b65","Type":"ContainerStarted","Data":"8d0526f21a273ebb260346d8293f6ea9c83d5855ca0d085c1876cb5c5e5f3f43"} Oct 07 13:42:05 crc kubenswrapper[4677]: I1007 13:42:05.381767 4677 generic.go:334] "Generic (PLEG): container finished" podID="03f37fe1-d7e4-4f42-b103-aa783f240b65" containerID="8d0526f21a273ebb260346d8293f6ea9c83d5855ca0d085c1876cb5c5e5f3f43" exitCode=0 Oct 07 13:42:05 crc kubenswrapper[4677]: I1007 13:42:05.381847 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j4n7c" event={"ID":"03f37fe1-d7e4-4f42-b103-aa783f240b65","Type":"ContainerDied","Data":"8d0526f21a273ebb260346d8293f6ea9c83d5855ca0d085c1876cb5c5e5f3f43"} Oct 07 13:42:05 crc kubenswrapper[4677]: I1007 13:42:05.384790 4677 generic.go:334] "Generic (PLEG): container finished" podID="fad3b0e3-5c0f-43f8-98ec-cae1e66b1cea" containerID="83c47c0d42c6bf1c47caed5faeb3f6cebf3c8e6c0cec10b9bdf4b2bd38900255" exitCode=0 Oct 07 13:42:05 crc kubenswrapper[4677]: I1007 13:42:05.384858 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gp9b6" event={"ID":"fad3b0e3-5c0f-43f8-98ec-cae1e66b1cea","Type":"ContainerDied","Data":"83c47c0d42c6bf1c47caed5faeb3f6cebf3c8e6c0cec10b9bdf4b2bd38900255"} Oct 07 13:42:05 crc kubenswrapper[4677]: I1007 13:42:05.387899 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z2scz" event={"ID":"37e4e5a2-3646-4573-a3cf-0212f31dc9d4","Type":"ContainerStarted","Data":"8a5ca798422afe234554f444caf3b0063ea32dd379e540c7b7dc7769cad8fdfc"} Oct 07 13:42:05 crc kubenswrapper[4677]: I1007 13:42:05.438275 4677 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-z2scz" podStartSLOduration=3.074540805 podStartE2EDuration="7.438253178s" podCreationTimestamp="2025-10-07 13:41:58 +0000 UTC" firstStartedPulling="2025-10-07 13:42:00.335536895 +0000 UTC m=+2091.821246010" lastFinishedPulling="2025-10-07 13:42:04.699249268 +0000 UTC m=+2096.184958383" observedRunningTime="2025-10-07 13:42:05.434241492 +0000 UTC m=+2096.919950607" watchObservedRunningTime="2025-10-07 13:42:05.438253178 +0000 UTC m=+2096.923962313" Oct 07 13:42:08 crc kubenswrapper[4677]: I1007 13:42:08.363589 4677 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-z2scz" Oct 07 13:42:08 crc kubenswrapper[4677]: I1007 13:42:08.363938 4677 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-z2scz" Oct 07 13:42:08 crc kubenswrapper[4677]: I1007 13:42:08.405382 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gp9b6" event={"ID":"fad3b0e3-5c0f-43f8-98ec-cae1e66b1cea","Type":"ContainerStarted","Data":"50c9bb01ce4cc1eb11eb431210327197d94ca03753054cbfee1403445da70fab"} Oct 07 13:42:08 
crc kubenswrapper[4677]: I1007 13:42:08.425603 4677 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-gp9b6" podStartSLOduration=3.781621891 podStartE2EDuration="8.425586705s" podCreationTimestamp="2025-10-07 13:42:00 +0000 UTC" firstStartedPulling="2025-10-07 13:42:02.358304516 +0000 UTC m=+2093.844013651" lastFinishedPulling="2025-10-07 13:42:07.00226933 +0000 UTC m=+2098.487978465" observedRunningTime="2025-10-07 13:42:08.423231848 +0000 UTC m=+2099.908940963" watchObservedRunningTime="2025-10-07 13:42:08.425586705 +0000 UTC m=+2099.911295820" Oct 07 13:42:09 crc kubenswrapper[4677]: I1007 13:42:09.400335 4677 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-z2scz" podUID="37e4e5a2-3646-4573-a3cf-0212f31dc9d4" containerName="registry-server" probeResult="failure" output=< Oct 07 13:42:09 crc kubenswrapper[4677]: timeout: failed to connect service ":50051" within 1s Oct 07 13:42:09 crc kubenswrapper[4677]: > Oct 07 13:42:10 crc kubenswrapper[4677]: I1007 13:42:10.420043 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j4n7c" event={"ID":"03f37fe1-d7e4-4f42-b103-aa783f240b65","Type":"ContainerStarted","Data":"aa3b76c4bf559b2ed3b3da0a9884e02ff6c40800c174bdf0d88be48f5a56c34f"} Oct 07 13:42:10 crc kubenswrapper[4677]: I1007 13:42:10.440717 4677 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-j4n7c" podStartSLOduration=4.74099195 podStartE2EDuration="10.440701255s" podCreationTimestamp="2025-10-07 13:42:00 +0000 UTC" firstStartedPulling="2025-10-07 13:42:02.356491144 +0000 UTC m=+2093.842200269" lastFinishedPulling="2025-10-07 13:42:08.056200469 +0000 UTC m=+2099.541909574" observedRunningTime="2025-10-07 13:42:10.437011809 +0000 UTC m=+2101.922720954" watchObservedRunningTime="2025-10-07 13:42:10.440701255 +0000 UTC m=+2101.926410370" Oct 07 13:42:10 crc kubenswrapper[4677]: I1007 13:42:10.560702 4677 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-gp9b6" Oct 07 13:42:10 crc kubenswrapper[4677]: I1007 13:42:10.561001 4677 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-gp9b6" Oct 07 13:42:10 crc kubenswrapper[4677]: I1007 13:42:10.607707 4677 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-gp9b6" Oct 07 13:42:10 crc kubenswrapper[4677]: I1007 13:42:10.761877 4677 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-j4n7c" Oct 07 13:42:10 crc kubenswrapper[4677]: I1007 13:42:10.762123 4677 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-j4n7c" Oct 07 13:42:11 crc kubenswrapper[4677]: I1007 13:42:11.815201 4677 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-j4n7c" podUID="03f37fe1-d7e4-4f42-b103-aa783f240b65" containerName="registry-server" probeResult="failure" output=< Oct 07 13:42:11 crc kubenswrapper[4677]: timeout: failed to connect service ":50051" within 1s Oct 07 13:42:11 crc kubenswrapper[4677]: > Oct 07 13:42:12 crc kubenswrapper[4677]: I1007 13:42:12.500588 4677 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-gp9b6" 
Oct 07 13:42:13 crc kubenswrapper[4677]: I1007 13:42:13.634017 4677 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-gp9b6"] Oct 07 13:42:14 crc kubenswrapper[4677]: I1007 13:42:14.462560 4677 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-gp9b6" podUID="fad3b0e3-5c0f-43f8-98ec-cae1e66b1cea" containerName="registry-server" containerID="cri-o://50c9bb01ce4cc1eb11eb431210327197d94ca03753054cbfee1403445da70fab" gracePeriod=2 Oct 07 13:42:17 crc kubenswrapper[4677]: I1007 13:42:17.480565 4677 generic.go:334] "Generic (PLEG): container finished" podID="fad3b0e3-5c0f-43f8-98ec-cae1e66b1cea" containerID="50c9bb01ce4cc1eb11eb431210327197d94ca03753054cbfee1403445da70fab" exitCode=0 Oct 07 13:42:17 crc kubenswrapper[4677]: I1007 13:42:17.480649 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gp9b6" event={"ID":"fad3b0e3-5c0f-43f8-98ec-cae1e66b1cea","Type":"ContainerDied","Data":"50c9bb01ce4cc1eb11eb431210327197d94ca03753054cbfee1403445da70fab"} Oct 07 13:42:18 crc kubenswrapper[4677]: I1007 13:42:18.428008 4677 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-z2scz" Oct 07 13:42:18 crc kubenswrapper[4677]: I1007 13:42:18.474624 4677 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-z2scz" Oct 07 13:42:18 crc kubenswrapper[4677]: I1007 13:42:18.662807 4677 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-z2scz"] Oct 07 13:42:18 crc kubenswrapper[4677]: I1007 13:42:18.921171 4677 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gp9b6" Oct 07 13:42:19 crc kubenswrapper[4677]: I1007 13:42:19.119858 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v6zwd\" (UniqueName: \"kubernetes.io/projected/fad3b0e3-5c0f-43f8-98ec-cae1e66b1cea-kube-api-access-v6zwd\") pod \"fad3b0e3-5c0f-43f8-98ec-cae1e66b1cea\" (UID: \"fad3b0e3-5c0f-43f8-98ec-cae1e66b1cea\") " Oct 07 13:42:19 crc kubenswrapper[4677]: I1007 13:42:19.120308 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fad3b0e3-5c0f-43f8-98ec-cae1e66b1cea-utilities\") pod \"fad3b0e3-5c0f-43f8-98ec-cae1e66b1cea\" (UID: \"fad3b0e3-5c0f-43f8-98ec-cae1e66b1cea\") " Oct 07 13:42:19 crc kubenswrapper[4677]: I1007 13:42:19.120392 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fad3b0e3-5c0f-43f8-98ec-cae1e66b1cea-catalog-content\") pod \"fad3b0e3-5c0f-43f8-98ec-cae1e66b1cea\" (UID: \"fad3b0e3-5c0f-43f8-98ec-cae1e66b1cea\") " Oct 07 13:42:19 crc kubenswrapper[4677]: I1007 13:42:19.121099 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fad3b0e3-5c0f-43f8-98ec-cae1e66b1cea-utilities" (OuterVolumeSpecName: "utilities") pod "fad3b0e3-5c0f-43f8-98ec-cae1e66b1cea" (UID: "fad3b0e3-5c0f-43f8-98ec-cae1e66b1cea"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:42:19 crc kubenswrapper[4677]: I1007 13:42:19.129729 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fad3b0e3-5c0f-43f8-98ec-cae1e66b1cea-kube-api-access-v6zwd" (OuterVolumeSpecName: "kube-api-access-v6zwd") pod "fad3b0e3-5c0f-43f8-98ec-cae1e66b1cea" (UID: "fad3b0e3-5c0f-43f8-98ec-cae1e66b1cea"). InnerVolumeSpecName "kube-api-access-v6zwd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:42:19 crc kubenswrapper[4677]: I1007 13:42:19.165361 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fad3b0e3-5c0f-43f8-98ec-cae1e66b1cea-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fad3b0e3-5c0f-43f8-98ec-cae1e66b1cea" (UID: "fad3b0e3-5c0f-43f8-98ec-cae1e66b1cea"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:42:19 crc kubenswrapper[4677]: I1007 13:42:19.221533 4677 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fad3b0e3-5c0f-43f8-98ec-cae1e66b1cea-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 13:42:19 crc kubenswrapper[4677]: I1007 13:42:19.221573 4677 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v6zwd\" (UniqueName: \"kubernetes.io/projected/fad3b0e3-5c0f-43f8-98ec-cae1e66b1cea-kube-api-access-v6zwd\") on node \"crc\" DevicePath \"\"" Oct 07 13:42:19 crc kubenswrapper[4677]: I1007 13:42:19.221583 4677 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fad3b0e3-5c0f-43f8-98ec-cae1e66b1cea-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 13:42:19 crc kubenswrapper[4677]: I1007 13:42:19.497277 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gp9b6" event={"ID":"fad3b0e3-5c0f-43f8-98ec-cae1e66b1cea","Type":"ContainerDied","Data":"b44773d26b01c1f81743665682461a17796e090c13e9f1903cd42a05f6322b58"} Oct 07 13:42:19 crc kubenswrapper[4677]: I1007 13:42:19.497308 4677 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-gp9b6" Oct 07 13:42:19 crc kubenswrapper[4677]: I1007 13:42:19.497329 4677 scope.go:117] "RemoveContainer" containerID="50c9bb01ce4cc1eb11eb431210327197d94ca03753054cbfee1403445da70fab" Oct 07 13:42:19 crc kubenswrapper[4677]: I1007 13:42:19.497445 4677 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-z2scz" podUID="37e4e5a2-3646-4573-a3cf-0212f31dc9d4" containerName="registry-server" containerID="cri-o://8a5ca798422afe234554f444caf3b0063ea32dd379e540c7b7dc7769cad8fdfc" gracePeriod=2 Oct 07 13:42:19 crc kubenswrapper[4677]: I1007 13:42:19.517587 4677 scope.go:117] "RemoveContainer" containerID="83c47c0d42c6bf1c47caed5faeb3f6cebf3c8e6c0cec10b9bdf4b2bd38900255" Oct 07 13:42:19 crc kubenswrapper[4677]: I1007 13:42:19.522565 4677 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-gp9b6"] Oct 07 13:42:19 crc kubenswrapper[4677]: I1007 13:42:19.526463 4677 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-gp9b6"] Oct 07 13:42:19 crc kubenswrapper[4677]: I1007 13:42:19.545633 4677 scope.go:117] "RemoveContainer" containerID="272185fdfc16ebea4a479c72f95eb29d1c7a3634a0a8947473bb98f91bdae90f" Oct 07 13:42:20 crc kubenswrapper[4677]: I1007 13:42:20.809768 4677 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-j4n7c" Oct 07 13:42:20 crc kubenswrapper[4677]: I1007 13:42:20.872216 4677 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-j4n7c" Oct 07 13:42:21 crc kubenswrapper[4677]: I1007 13:42:21.310017 4677 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fad3b0e3-5c0f-43f8-98ec-cae1e66b1cea" path="/var/lib/kubelet/pods/fad3b0e3-5c0f-43f8-98ec-cae1e66b1cea/volumes" Oct 07 13:42:22 crc kubenswrapper[4677]: I1007 13:42:22.372851 4677 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-z2scz" Oct 07 13:42:22 crc kubenswrapper[4677]: I1007 13:42:22.517649 4677 generic.go:334] "Generic (PLEG): container finished" podID="37e4e5a2-3646-4573-a3cf-0212f31dc9d4" containerID="8a5ca798422afe234554f444caf3b0063ea32dd379e540c7b7dc7769cad8fdfc" exitCode=0 Oct 07 13:42:22 crc kubenswrapper[4677]: I1007 13:42:22.517711 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z2scz" event={"ID":"37e4e5a2-3646-4573-a3cf-0212f31dc9d4","Type":"ContainerDied","Data":"8a5ca798422afe234554f444caf3b0063ea32dd379e540c7b7dc7769cad8fdfc"} Oct 07 13:42:22 crc kubenswrapper[4677]: I1007 13:42:22.517728 4677 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-z2scz" Oct 07 13:42:22 crc kubenswrapper[4677]: I1007 13:42:22.517752 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z2scz" event={"ID":"37e4e5a2-3646-4573-a3cf-0212f31dc9d4","Type":"ContainerDied","Data":"b7250bf47072398678f7c9033b72b5312ef1f9af1e275b4322654875c0918b9c"} Oct 07 13:42:22 crc kubenswrapper[4677]: I1007 13:42:22.517772 4677 scope.go:117] "RemoveContainer" containerID="8a5ca798422afe234554f444caf3b0063ea32dd379e540c7b7dc7769cad8fdfc" Oct 07 13:42:22 crc kubenswrapper[4677]: I1007 13:42:22.536330 4677 scope.go:117] "RemoveContainer" containerID="6effb6df005653d2c34d573684ddbb56c31fb7b51b3e3e2a98ac8cdd81e82ee7" Oct 07 13:42:22 crc kubenswrapper[4677]: I1007 13:42:22.554016 4677 scope.go:117] "RemoveContainer" containerID="bf7bebfc45e1ce944dbf6422728475a297669e7ff016c3916975f42ffa7bca1d" Oct 07 13:42:22 crc kubenswrapper[4677]: I1007 13:42:22.570766 4677 scope.go:117] "RemoveContainer" containerID="8a5ca798422afe234554f444caf3b0063ea32dd379e540c7b7dc7769cad8fdfc" Oct 07 13:42:22 crc kubenswrapper[4677]: E1007 13:42:22.571584 4677 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8a5ca798422afe234554f444caf3b0063ea32dd379e540c7b7dc7769cad8fdfc\": container with ID starting with 8a5ca798422afe234554f444caf3b0063ea32dd379e540c7b7dc7769cad8fdfc not found: ID does not exist" containerID="8a5ca798422afe234554f444caf3b0063ea32dd379e540c7b7dc7769cad8fdfc" Oct 07 13:42:22 crc kubenswrapper[4677]: I1007 13:42:22.571634 4677 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a5ca798422afe234554f444caf3b0063ea32dd379e540c7b7dc7769cad8fdfc"} err="failed to get container status \"8a5ca798422afe234554f444caf3b0063ea32dd379e540c7b7dc7769cad8fdfc\": rpc error: code = NotFound desc = could not find container \"8a5ca798422afe234554f444caf3b0063ea32dd379e540c7b7dc7769cad8fdfc\": container with ID starting with 8a5ca798422afe234554f444caf3b0063ea32dd379e540c7b7dc7769cad8fdfc not found: ID does not exist" Oct 07 13:42:22 crc kubenswrapper[4677]: I1007 13:42:22.571671 4677 scope.go:117] "RemoveContainer" containerID="6effb6df005653d2c34d573684ddbb56c31fb7b51b3e3e2a98ac8cdd81e82ee7" Oct 07 13:42:22 crc kubenswrapper[4677]: E1007 13:42:22.572147 4677 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6effb6df005653d2c34d573684ddbb56c31fb7b51b3e3e2a98ac8cdd81e82ee7\": container with ID starting with 6effb6df005653d2c34d573684ddbb56c31fb7b51b3e3e2a98ac8cdd81e82ee7 not found: ID does not exist" containerID="6effb6df005653d2c34d573684ddbb56c31fb7b51b3e3e2a98ac8cdd81e82ee7" Oct 07 13:42:22 crc kubenswrapper[4677]: I1007 13:42:22.572186 4677 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6effb6df005653d2c34d573684ddbb56c31fb7b51b3e3e2a98ac8cdd81e82ee7"} err="failed to get container status \"6effb6df005653d2c34d573684ddbb56c31fb7b51b3e3e2a98ac8cdd81e82ee7\": rpc error: code = NotFound desc = could not find container \"6effb6df005653d2c34d573684ddbb56c31fb7b51b3e3e2a98ac8cdd81e82ee7\": container with ID starting with 6effb6df005653d2c34d573684ddbb56c31fb7b51b3e3e2a98ac8cdd81e82ee7 not found: ID does not exist" Oct 07 13:42:22 crc kubenswrapper[4677]: I1007 13:42:22.572208 4677 scope.go:117] "RemoveContainer" 
containerID="bf7bebfc45e1ce944dbf6422728475a297669e7ff016c3916975f42ffa7bca1d" Oct 07 13:42:22 crc kubenswrapper[4677]: I1007 13:42:22.572382 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/37e4e5a2-3646-4573-a3cf-0212f31dc9d4-utilities\") pod \"37e4e5a2-3646-4573-a3cf-0212f31dc9d4\" (UID: \"37e4e5a2-3646-4573-a3cf-0212f31dc9d4\") " Oct 07 13:42:22 crc kubenswrapper[4677]: I1007 13:42:22.572423 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kcfms\" (UniqueName: \"kubernetes.io/projected/37e4e5a2-3646-4573-a3cf-0212f31dc9d4-kube-api-access-kcfms\") pod \"37e4e5a2-3646-4573-a3cf-0212f31dc9d4\" (UID: \"37e4e5a2-3646-4573-a3cf-0212f31dc9d4\") " Oct 07 13:42:22 crc kubenswrapper[4677]: I1007 13:42:22.572508 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/37e4e5a2-3646-4573-a3cf-0212f31dc9d4-catalog-content\") pod \"37e4e5a2-3646-4573-a3cf-0212f31dc9d4\" (UID: \"37e4e5a2-3646-4573-a3cf-0212f31dc9d4\") " Oct 07 13:42:22 crc kubenswrapper[4677]: E1007 13:42:22.573282 4677 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bf7bebfc45e1ce944dbf6422728475a297669e7ff016c3916975f42ffa7bca1d\": container with ID starting with bf7bebfc45e1ce944dbf6422728475a297669e7ff016c3916975f42ffa7bca1d not found: ID does not exist" containerID="bf7bebfc45e1ce944dbf6422728475a297669e7ff016c3916975f42ffa7bca1d" Oct 07 13:42:22 crc kubenswrapper[4677]: I1007 13:42:22.573331 4677 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bf7bebfc45e1ce944dbf6422728475a297669e7ff016c3916975f42ffa7bca1d"} err="failed to get container status \"bf7bebfc45e1ce944dbf6422728475a297669e7ff016c3916975f42ffa7bca1d\": rpc error: code = NotFound desc = could not find container \"bf7bebfc45e1ce944dbf6422728475a297669e7ff016c3916975f42ffa7bca1d\": container with ID starting with bf7bebfc45e1ce944dbf6422728475a297669e7ff016c3916975f42ffa7bca1d not found: ID does not exist" Oct 07 13:42:22 crc kubenswrapper[4677]: I1007 13:42:22.573665 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/37e4e5a2-3646-4573-a3cf-0212f31dc9d4-utilities" (OuterVolumeSpecName: "utilities") pod "37e4e5a2-3646-4573-a3cf-0212f31dc9d4" (UID: "37e4e5a2-3646-4573-a3cf-0212f31dc9d4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:42:22 crc kubenswrapper[4677]: I1007 13:42:22.585603 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/37e4e5a2-3646-4573-a3cf-0212f31dc9d4-kube-api-access-kcfms" (OuterVolumeSpecName: "kube-api-access-kcfms") pod "37e4e5a2-3646-4573-a3cf-0212f31dc9d4" (UID: "37e4e5a2-3646-4573-a3cf-0212f31dc9d4"). InnerVolumeSpecName "kube-api-access-kcfms". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:42:22 crc kubenswrapper[4677]: I1007 13:42:22.660837 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/37e4e5a2-3646-4573-a3cf-0212f31dc9d4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "37e4e5a2-3646-4573-a3cf-0212f31dc9d4" (UID: "37e4e5a2-3646-4573-a3cf-0212f31dc9d4"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:42:22 crc kubenswrapper[4677]: I1007 13:42:22.676297 4677 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/37e4e5a2-3646-4573-a3cf-0212f31dc9d4-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 13:42:22 crc kubenswrapper[4677]: I1007 13:42:22.676364 4677 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/37e4e5a2-3646-4573-a3cf-0212f31dc9d4-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 13:42:22 crc kubenswrapper[4677]: I1007 13:42:22.676376 4677 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kcfms\" (UniqueName: \"kubernetes.io/projected/37e4e5a2-3646-4573-a3cf-0212f31dc9d4-kube-api-access-kcfms\") on node \"crc\" DevicePath \"\"" Oct 07 13:42:22 crc kubenswrapper[4677]: I1007 13:42:22.843633 4677 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-z2scz"] Oct 07 13:42:22 crc kubenswrapper[4677]: I1007 13:42:22.846833 4677 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-z2scz"] Oct 07 13:42:23 crc kubenswrapper[4677]: I1007 13:42:23.061912 4677 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-j4n7c"] Oct 07 13:42:23 crc kubenswrapper[4677]: I1007 13:42:23.062184 4677 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-j4n7c" podUID="03f37fe1-d7e4-4f42-b103-aa783f240b65" containerName="registry-server" containerID="cri-o://aa3b76c4bf559b2ed3b3da0a9884e02ff6c40800c174bdf0d88be48f5a56c34f" gracePeriod=2 Oct 07 13:42:23 crc kubenswrapper[4677]: I1007 13:42:23.320732 4677 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="37e4e5a2-3646-4573-a3cf-0212f31dc9d4" path="/var/lib/kubelet/pods/37e4e5a2-3646-4573-a3cf-0212f31dc9d4/volumes" Oct 07 13:42:23 crc kubenswrapper[4677]: I1007 13:42:23.456625 4677 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-j4n7c" Oct 07 13:42:23 crc kubenswrapper[4677]: I1007 13:42:23.485003 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/03f37fe1-d7e4-4f42-b103-aa783f240b65-utilities\") pod \"03f37fe1-d7e4-4f42-b103-aa783f240b65\" (UID: \"03f37fe1-d7e4-4f42-b103-aa783f240b65\") " Oct 07 13:42:23 crc kubenswrapper[4677]: I1007 13:42:23.485102 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c57nz\" (UniqueName: \"kubernetes.io/projected/03f37fe1-d7e4-4f42-b103-aa783f240b65-kube-api-access-c57nz\") pod \"03f37fe1-d7e4-4f42-b103-aa783f240b65\" (UID: \"03f37fe1-d7e4-4f42-b103-aa783f240b65\") " Oct 07 13:42:23 crc kubenswrapper[4677]: I1007 13:42:23.485133 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/03f37fe1-d7e4-4f42-b103-aa783f240b65-catalog-content\") pod \"03f37fe1-d7e4-4f42-b103-aa783f240b65\" (UID: \"03f37fe1-d7e4-4f42-b103-aa783f240b65\") " Oct 07 13:42:23 crc kubenswrapper[4677]: I1007 13:42:23.486153 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/03f37fe1-d7e4-4f42-b103-aa783f240b65-utilities" (OuterVolumeSpecName: "utilities") pod "03f37fe1-d7e4-4f42-b103-aa783f240b65" (UID: "03f37fe1-d7e4-4f42-b103-aa783f240b65"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:42:23 crc kubenswrapper[4677]: I1007 13:42:23.495739 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/03f37fe1-d7e4-4f42-b103-aa783f240b65-kube-api-access-c57nz" (OuterVolumeSpecName: "kube-api-access-c57nz") pod "03f37fe1-d7e4-4f42-b103-aa783f240b65" (UID: "03f37fe1-d7e4-4f42-b103-aa783f240b65"). InnerVolumeSpecName "kube-api-access-c57nz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:42:23 crc kubenswrapper[4677]: I1007 13:42:23.540085 4677 generic.go:334] "Generic (PLEG): container finished" podID="03f37fe1-d7e4-4f42-b103-aa783f240b65" containerID="aa3b76c4bf559b2ed3b3da0a9884e02ff6c40800c174bdf0d88be48f5a56c34f" exitCode=0 Oct 07 13:42:23 crc kubenswrapper[4677]: I1007 13:42:23.540186 4677 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-j4n7c" Oct 07 13:42:23 crc kubenswrapper[4677]: I1007 13:42:23.540185 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j4n7c" event={"ID":"03f37fe1-d7e4-4f42-b103-aa783f240b65","Type":"ContainerDied","Data":"aa3b76c4bf559b2ed3b3da0a9884e02ff6c40800c174bdf0d88be48f5a56c34f"} Oct 07 13:42:23 crc kubenswrapper[4677]: I1007 13:42:23.540341 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j4n7c" event={"ID":"03f37fe1-d7e4-4f42-b103-aa783f240b65","Type":"ContainerDied","Data":"50f9e8545676e7193e1f1693b8f6e06ef31cbc5060984c8f79e7db7864988a75"} Oct 07 13:42:23 crc kubenswrapper[4677]: I1007 13:42:23.540371 4677 scope.go:117] "RemoveContainer" containerID="aa3b76c4bf559b2ed3b3da0a9884e02ff6c40800c174bdf0d88be48f5a56c34f" Oct 07 13:42:23 crc kubenswrapper[4677]: I1007 13:42:23.544862 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/03f37fe1-d7e4-4f42-b103-aa783f240b65-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "03f37fe1-d7e4-4f42-b103-aa783f240b65" (UID: "03f37fe1-d7e4-4f42-b103-aa783f240b65"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:42:23 crc kubenswrapper[4677]: I1007 13:42:23.557958 4677 scope.go:117] "RemoveContainer" containerID="8d0526f21a273ebb260346d8293f6ea9c83d5855ca0d085c1876cb5c5e5f3f43" Oct 07 13:42:23 crc kubenswrapper[4677]: I1007 13:42:23.582486 4677 scope.go:117] "RemoveContainer" containerID="9de82f601ca031f638715783378334690c954540a7f99b1b0ff4289f3ccf9e61" Oct 07 13:42:23 crc kubenswrapper[4677]: I1007 13:42:23.586073 4677 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c57nz\" (UniqueName: \"kubernetes.io/projected/03f37fe1-d7e4-4f42-b103-aa783f240b65-kube-api-access-c57nz\") on node \"crc\" DevicePath \"\"" Oct 07 13:42:23 crc kubenswrapper[4677]: I1007 13:42:23.586208 4677 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/03f37fe1-d7e4-4f42-b103-aa783f240b65-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 13:42:23 crc kubenswrapper[4677]: I1007 13:42:23.586285 4677 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/03f37fe1-d7e4-4f42-b103-aa783f240b65-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 13:42:23 crc kubenswrapper[4677]: I1007 13:42:23.601584 4677 scope.go:117] "RemoveContainer" containerID="aa3b76c4bf559b2ed3b3da0a9884e02ff6c40800c174bdf0d88be48f5a56c34f" Oct 07 13:42:23 crc kubenswrapper[4677]: E1007 13:42:23.602091 4677 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa3b76c4bf559b2ed3b3da0a9884e02ff6c40800c174bdf0d88be48f5a56c34f\": container with ID starting with aa3b76c4bf559b2ed3b3da0a9884e02ff6c40800c174bdf0d88be48f5a56c34f not found: ID does not exist" containerID="aa3b76c4bf559b2ed3b3da0a9884e02ff6c40800c174bdf0d88be48f5a56c34f" Oct 07 13:42:23 crc kubenswrapper[4677]: I1007 13:42:23.602148 4677 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa3b76c4bf559b2ed3b3da0a9884e02ff6c40800c174bdf0d88be48f5a56c34f"} err="failed to get container status \"aa3b76c4bf559b2ed3b3da0a9884e02ff6c40800c174bdf0d88be48f5a56c34f\": rpc error: code = NotFound desc = could not find container 
\"aa3b76c4bf559b2ed3b3da0a9884e02ff6c40800c174bdf0d88be48f5a56c34f\": container with ID starting with aa3b76c4bf559b2ed3b3da0a9884e02ff6c40800c174bdf0d88be48f5a56c34f not found: ID does not exist" Oct 07 13:42:23 crc kubenswrapper[4677]: I1007 13:42:23.602183 4677 scope.go:117] "RemoveContainer" containerID="8d0526f21a273ebb260346d8293f6ea9c83d5855ca0d085c1876cb5c5e5f3f43" Oct 07 13:42:23 crc kubenswrapper[4677]: E1007 13:42:23.602448 4677 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8d0526f21a273ebb260346d8293f6ea9c83d5855ca0d085c1876cb5c5e5f3f43\": container with ID starting with 8d0526f21a273ebb260346d8293f6ea9c83d5855ca0d085c1876cb5c5e5f3f43 not found: ID does not exist" containerID="8d0526f21a273ebb260346d8293f6ea9c83d5855ca0d085c1876cb5c5e5f3f43" Oct 07 13:42:23 crc kubenswrapper[4677]: I1007 13:42:23.602470 4677 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8d0526f21a273ebb260346d8293f6ea9c83d5855ca0d085c1876cb5c5e5f3f43"} err="failed to get container status \"8d0526f21a273ebb260346d8293f6ea9c83d5855ca0d085c1876cb5c5e5f3f43\": rpc error: code = NotFound desc = could not find container \"8d0526f21a273ebb260346d8293f6ea9c83d5855ca0d085c1876cb5c5e5f3f43\": container with ID starting with 8d0526f21a273ebb260346d8293f6ea9c83d5855ca0d085c1876cb5c5e5f3f43 not found: ID does not exist" Oct 07 13:42:23 crc kubenswrapper[4677]: I1007 13:42:23.602482 4677 scope.go:117] "RemoveContainer" containerID="9de82f601ca031f638715783378334690c954540a7f99b1b0ff4289f3ccf9e61" Oct 07 13:42:23 crc kubenswrapper[4677]: E1007 13:42:23.602770 4677 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9de82f601ca031f638715783378334690c954540a7f99b1b0ff4289f3ccf9e61\": container with ID starting with 9de82f601ca031f638715783378334690c954540a7f99b1b0ff4289f3ccf9e61 not found: ID does not exist" containerID="9de82f601ca031f638715783378334690c954540a7f99b1b0ff4289f3ccf9e61" Oct 07 13:42:23 crc kubenswrapper[4677]: I1007 13:42:23.602788 4677 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9de82f601ca031f638715783378334690c954540a7f99b1b0ff4289f3ccf9e61"} err="failed to get container status \"9de82f601ca031f638715783378334690c954540a7f99b1b0ff4289f3ccf9e61\": rpc error: code = NotFound desc = could not find container \"9de82f601ca031f638715783378334690c954540a7f99b1b0ff4289f3ccf9e61\": container with ID starting with 9de82f601ca031f638715783378334690c954540a7f99b1b0ff4289f3ccf9e61 not found: ID does not exist" Oct 07 13:42:23 crc kubenswrapper[4677]: I1007 13:42:23.867756 4677 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-j4n7c"] Oct 07 13:42:23 crc kubenswrapper[4677]: I1007 13:42:23.872046 4677 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-j4n7c"] Oct 07 13:42:25 crc kubenswrapper[4677]: I1007 13:42:25.311120 4677 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="03f37fe1-d7e4-4f42-b103-aa783f240b65" path="/var/lib/kubelet/pods/03f37fe1-d7e4-4f42-b103-aa783f240b65/volumes" Oct 07 13:42:31 crc kubenswrapper[4677]: I1007 13:42:31.079131 4677 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-wklrf_2fef6870-fac6-49bc-8471-fb78198ba057/control-plane-machine-set-operator/0.log" Oct 07 
13:42:31 crc kubenswrapper[4677]: I1007 13:42:31.202609 4677 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-b5qm4_5172e7b5-3ef1-4f51-8874-8d4ac858284b/kube-rbac-proxy/0.log" Oct 07 13:42:31 crc kubenswrapper[4677]: I1007 13:42:31.247215 4677 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-b5qm4_5172e7b5-3ef1-4f51-8874-8d4ac858284b/machine-api-operator/0.log" Oct 07 13:42:40 crc kubenswrapper[4677]: I1007 13:42:40.918170 4677 patch_prober.go:28] interesting pod/machine-config-daemon-r7cnz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 13:42:40 crc kubenswrapper[4677]: I1007 13:42:40.918752 4677 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-r7cnz" podUID="7879fa59-a7cb-4d29-ba3a-c91f43bfcba6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 13:42:44 crc kubenswrapper[4677]: I1007 13:42:44.696940 4677 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-68d546b9d8-kmjvc_fc03bc45-d39d-4302-839a-1f89960e640f/controller/0.log" Oct 07 13:42:44 crc kubenswrapper[4677]: I1007 13:42:44.697662 4677 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-68d546b9d8-kmjvc_fc03bc45-d39d-4302-839a-1f89960e640f/kube-rbac-proxy/0.log" Oct 07 13:42:44 crc kubenswrapper[4677]: I1007 13:42:44.835002 4677 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hq5q5_c6f0a46e-3591-4f18-9ff7-867b546b2273/cp-frr-files/0.log" Oct 07 13:42:44 crc kubenswrapper[4677]: I1007 13:42:44.995519 4677 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hq5q5_c6f0a46e-3591-4f18-9ff7-867b546b2273/cp-frr-files/0.log" Oct 07 13:42:45 crc kubenswrapper[4677]: I1007 13:42:45.004625 4677 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hq5q5_c6f0a46e-3591-4f18-9ff7-867b546b2273/cp-metrics/0.log" Oct 07 13:42:45 crc kubenswrapper[4677]: I1007 13:42:45.020764 4677 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hq5q5_c6f0a46e-3591-4f18-9ff7-867b546b2273/cp-reloader/0.log" Oct 07 13:42:45 crc kubenswrapper[4677]: I1007 13:42:45.074030 4677 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hq5q5_c6f0a46e-3591-4f18-9ff7-867b546b2273/cp-reloader/0.log" Oct 07 13:42:45 crc kubenswrapper[4677]: I1007 13:42:45.204630 4677 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hq5q5_c6f0a46e-3591-4f18-9ff7-867b546b2273/cp-metrics/0.log" Oct 07 13:42:45 crc kubenswrapper[4677]: I1007 13:42:45.216669 4677 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hq5q5_c6f0a46e-3591-4f18-9ff7-867b546b2273/cp-frr-files/0.log" Oct 07 13:42:45 crc kubenswrapper[4677]: I1007 13:42:45.242228 4677 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hq5q5_c6f0a46e-3591-4f18-9ff7-867b546b2273/cp-reloader/0.log" Oct 07 13:42:45 crc kubenswrapper[4677]: I1007 13:42:45.247652 4677 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-hq5q5_c6f0a46e-3591-4f18-9ff7-867b546b2273/cp-metrics/0.log" Oct 07 13:42:45 crc kubenswrapper[4677]: I1007 13:42:45.402284 4677 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hq5q5_c6f0a46e-3591-4f18-9ff7-867b546b2273/cp-frr-files/0.log" Oct 07 13:42:45 crc kubenswrapper[4677]: I1007 13:42:45.410685 4677 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hq5q5_c6f0a46e-3591-4f18-9ff7-867b546b2273/cp-metrics/0.log" Oct 07 13:42:45 crc kubenswrapper[4677]: I1007 13:42:45.412093 4677 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hq5q5_c6f0a46e-3591-4f18-9ff7-867b546b2273/cp-reloader/0.log" Oct 07 13:42:45 crc kubenswrapper[4677]: I1007 13:42:45.438590 4677 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hq5q5_c6f0a46e-3591-4f18-9ff7-867b546b2273/controller/0.log" Oct 07 13:42:45 crc kubenswrapper[4677]: I1007 13:42:45.578727 4677 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hq5q5_c6f0a46e-3591-4f18-9ff7-867b546b2273/frr-metrics/0.log" Oct 07 13:42:45 crc kubenswrapper[4677]: I1007 13:42:45.597185 4677 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hq5q5_c6f0a46e-3591-4f18-9ff7-867b546b2273/kube-rbac-proxy/0.log" Oct 07 13:42:45 crc kubenswrapper[4677]: I1007 13:42:45.658753 4677 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hq5q5_c6f0a46e-3591-4f18-9ff7-867b546b2273/kube-rbac-proxy-frr/0.log" Oct 07 13:42:45 crc kubenswrapper[4677]: I1007 13:42:45.788123 4677 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hq5q5_c6f0a46e-3591-4f18-9ff7-867b546b2273/reloader/0.log" Oct 07 13:42:45 crc kubenswrapper[4677]: I1007 13:42:45.827751 4677 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-64bf5d555-g8px2_90595934-7ad8-4e4d-b918-2f3e63d63e34/frr-k8s-webhook-server/0.log" Oct 07 13:42:45 crc kubenswrapper[4677]: I1007 13:42:45.991654 4677 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-859694fc4f-rbbdh_3c7d4a9a-028b-4131-8a8e-722259f8cd2c/manager/0.log" Oct 07 13:42:46 crc kubenswrapper[4677]: I1007 13:42:46.072233 4677 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hq5q5_c6f0a46e-3591-4f18-9ff7-867b546b2273/frr/0.log" Oct 07 13:42:46 crc kubenswrapper[4677]: I1007 13:42:46.160026 4677 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-54787cf69c-lk2bh_407281a9-edd0-4bc9-bb53-dee866971f52/webhook-server/0.log" Oct 07 13:42:46 crc kubenswrapper[4677]: I1007 13:42:46.173929 4677 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-z6n69_6e40d031-a727-42d1-af91-4f20c3c10fef/kube-rbac-proxy/0.log" Oct 07 13:42:46 crc kubenswrapper[4677]: I1007 13:42:46.316820 4677 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-z6n69_6e40d031-a727-42d1-af91-4f20c3c10fef/speaker/0.log" Oct 07 13:43:07 crc kubenswrapper[4677]: I1007 13:43:07.863761 4677 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2xjkqz_987fd203-f583-44fe-b845-d33510d6bb30/util/0.log" Oct 07 13:43:08 crc kubenswrapper[4677]: I1007 13:43:08.017587 4677 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2xjkqz_987fd203-f583-44fe-b845-d33510d6bb30/pull/0.log" Oct 07 13:43:08 crc kubenswrapper[4677]: I1007 13:43:08.045173 4677 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2xjkqz_987fd203-f583-44fe-b845-d33510d6bb30/util/0.log" Oct 07 13:43:08 crc kubenswrapper[4677]: I1007 13:43:08.051531 4677 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2xjkqz_987fd203-f583-44fe-b845-d33510d6bb30/pull/0.log" Oct 07 13:43:08 crc kubenswrapper[4677]: I1007 13:43:08.185387 4677 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2xjkqz_987fd203-f583-44fe-b845-d33510d6bb30/util/0.log" Oct 07 13:43:08 crc kubenswrapper[4677]: I1007 13:43:08.214307 4677 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2xjkqz_987fd203-f583-44fe-b845-d33510d6bb30/pull/0.log" Oct 07 13:43:08 crc kubenswrapper[4677]: I1007 13:43:08.219883 4677 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2xjkqz_987fd203-f583-44fe-b845-d33510d6bb30/extract/0.log" Oct 07 13:43:08 crc kubenswrapper[4677]: I1007 13:43:08.344636 4677 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-pjgk2_a97dd5c5-2e20-4155-8829-acd24af4fe9f/extract-utilities/0.log" Oct 07 13:43:08 crc kubenswrapper[4677]: I1007 13:43:08.492964 4677 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-pjgk2_a97dd5c5-2e20-4155-8829-acd24af4fe9f/extract-content/0.log" Oct 07 13:43:08 crc kubenswrapper[4677]: I1007 13:43:08.512565 4677 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-pjgk2_a97dd5c5-2e20-4155-8829-acd24af4fe9f/extract-content/0.log" Oct 07 13:43:08 crc kubenswrapper[4677]: I1007 13:43:08.528991 4677 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-pjgk2_a97dd5c5-2e20-4155-8829-acd24af4fe9f/extract-utilities/0.log" Oct 07 13:43:08 crc kubenswrapper[4677]: I1007 13:43:08.672011 4677 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-pjgk2_a97dd5c5-2e20-4155-8829-acd24af4fe9f/extract-utilities/0.log" Oct 07 13:43:08 crc kubenswrapper[4677]: I1007 13:43:08.724228 4677 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-pjgk2_a97dd5c5-2e20-4155-8829-acd24af4fe9f/extract-content/0.log" Oct 07 13:43:08 crc kubenswrapper[4677]: I1007 13:43:08.888023 4677 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-n428h_ad380640-a5ab-4fb9-838b-28b4732c597e/extract-utilities/0.log" Oct 07 13:43:09 crc kubenswrapper[4677]: I1007 13:43:09.043800 4677 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-pjgk2_a97dd5c5-2e20-4155-8829-acd24af4fe9f/registry-server/0.log" Oct 07 13:43:09 crc kubenswrapper[4677]: I1007 13:43:09.058693 4677 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-n428h_ad380640-a5ab-4fb9-838b-28b4732c597e/extract-content/0.log" Oct 07 13:43:09 crc kubenswrapper[4677]: I1007 13:43:09.101060 4677 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-n428h_ad380640-a5ab-4fb9-838b-28b4732c597e/extract-utilities/0.log" Oct 07 13:43:09 crc kubenswrapper[4677]: I1007 13:43:09.122685 4677 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-n428h_ad380640-a5ab-4fb9-838b-28b4732c597e/extract-content/0.log" Oct 07 13:43:09 crc kubenswrapper[4677]: I1007 13:43:09.221894 4677 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-n428h_ad380640-a5ab-4fb9-838b-28b4732c597e/extract-utilities/0.log" Oct 07 13:43:09 crc kubenswrapper[4677]: I1007 13:43:09.230150 4677 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-n428h_ad380640-a5ab-4fb9-838b-28b4732c597e/extract-content/0.log" Oct 07 13:43:09 crc kubenswrapper[4677]: I1007 13:43:09.455356 4677 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-zbppr_c8872e7f-608d-4ade-8466-e1e743417ece/marketplace-operator/0.log" Oct 07 13:43:09 crc kubenswrapper[4677]: I1007 13:43:09.559558 4677 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-r7528_422fbc9b-de01-4f7f-8be7-0036569e6dbb/extract-utilities/0.log" Oct 07 13:43:09 crc kubenswrapper[4677]: I1007 13:43:09.650537 4677 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-n428h_ad380640-a5ab-4fb9-838b-28b4732c597e/registry-server/0.log" Oct 07 13:43:09 crc kubenswrapper[4677]: I1007 13:43:09.704572 4677 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-r7528_422fbc9b-de01-4f7f-8be7-0036569e6dbb/extract-content/0.log" Oct 07 13:43:09 crc kubenswrapper[4677]: I1007 13:43:09.762482 4677 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-r7528_422fbc9b-de01-4f7f-8be7-0036569e6dbb/extract-content/0.log" Oct 07 13:43:09 crc kubenswrapper[4677]: I1007 13:43:09.768482 4677 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-r7528_422fbc9b-de01-4f7f-8be7-0036569e6dbb/extract-utilities/0.log" Oct 07 13:43:09 crc kubenswrapper[4677]: I1007 13:43:09.900760 4677 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-r7528_422fbc9b-de01-4f7f-8be7-0036569e6dbb/extract-utilities/0.log" Oct 07 13:43:09 crc kubenswrapper[4677]: I1007 13:43:09.905715 4677 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-r7528_422fbc9b-de01-4f7f-8be7-0036569e6dbb/extract-content/0.log" Oct 07 13:43:10 crc kubenswrapper[4677]: I1007 13:43:10.023843 4677 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-r7528_422fbc9b-de01-4f7f-8be7-0036569e6dbb/registry-server/0.log" Oct 07 13:43:10 crc kubenswrapper[4677]: I1007 13:43:10.072910 4677 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-pm2jz_e628b85d-09f9-4559-9853-b35c19a0e0e6/extract-utilities/0.log" Oct 07 13:43:10 crc kubenswrapper[4677]: I1007 13:43:10.260864 4677 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-operators-pm2jz_e628b85d-09f9-4559-9853-b35c19a0e0e6/extract-utilities/0.log" Oct 07 13:43:10 crc kubenswrapper[4677]: I1007 13:43:10.267584 4677 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-pm2jz_e628b85d-09f9-4559-9853-b35c19a0e0e6/extract-content/0.log" Oct 07 13:43:10 crc kubenswrapper[4677]: I1007 13:43:10.288277 4677 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-pm2jz_e628b85d-09f9-4559-9853-b35c19a0e0e6/extract-content/0.log" Oct 07 13:43:10 crc kubenswrapper[4677]: I1007 13:43:10.460194 4677 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-pm2jz_e628b85d-09f9-4559-9853-b35c19a0e0e6/extract-utilities/0.log" Oct 07 13:43:10 crc kubenswrapper[4677]: I1007 13:43:10.475699 4677 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-pm2jz_e628b85d-09f9-4559-9853-b35c19a0e0e6/extract-content/0.log" Oct 07 13:43:10 crc kubenswrapper[4677]: I1007 13:43:10.868472 4677 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-pm2jz_e628b85d-09f9-4559-9853-b35c19a0e0e6/registry-server/0.log" Oct 07 13:43:10 crc kubenswrapper[4677]: I1007 13:43:10.917233 4677 patch_prober.go:28] interesting pod/machine-config-daemon-r7cnz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 13:43:10 crc kubenswrapper[4677]: I1007 13:43:10.917305 4677 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-r7cnz" podUID="7879fa59-a7cb-4d29-ba3a-c91f43bfcba6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 13:43:40 crc kubenswrapper[4677]: I1007 13:43:40.917897 4677 patch_prober.go:28] interesting pod/machine-config-daemon-r7cnz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 13:43:40 crc kubenswrapper[4677]: I1007 13:43:40.918718 4677 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-r7cnz" podUID="7879fa59-a7cb-4d29-ba3a-c91f43bfcba6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 13:43:40 crc kubenswrapper[4677]: I1007 13:43:40.918819 4677 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-r7cnz" Oct 07 13:43:40 crc kubenswrapper[4677]: I1007 13:43:40.919894 4677 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a3aa1af230f942f967cc0d97803787614843e98cff81e85c6c158a5f9d90198f"} pod="openshift-machine-config-operator/machine-config-daemon-r7cnz" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 07 13:43:40 crc kubenswrapper[4677]: I1007 13:43:40.919998 4677 kuberuntime_container.go:808] "Killing container with a grace 
period" pod="openshift-machine-config-operator/machine-config-daemon-r7cnz" podUID="7879fa59-a7cb-4d29-ba3a-c91f43bfcba6" containerName="machine-config-daemon" containerID="cri-o://a3aa1af230f942f967cc0d97803787614843e98cff81e85c6c158a5f9d90198f" gracePeriod=600 Oct 07 13:43:41 crc kubenswrapper[4677]: I1007 13:43:41.993704 4677 generic.go:334] "Generic (PLEG): container finished" podID="7879fa59-a7cb-4d29-ba3a-c91f43bfcba6" containerID="a3aa1af230f942f967cc0d97803787614843e98cff81e85c6c158a5f9d90198f" exitCode=0 Oct 07 13:43:41 crc kubenswrapper[4677]: I1007 13:43:41.994281 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-r7cnz" event={"ID":"7879fa59-a7cb-4d29-ba3a-c91f43bfcba6","Type":"ContainerDied","Data":"a3aa1af230f942f967cc0d97803787614843e98cff81e85c6c158a5f9d90198f"} Oct 07 13:43:41 crc kubenswrapper[4677]: I1007 13:43:41.994314 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-r7cnz" event={"ID":"7879fa59-a7cb-4d29-ba3a-c91f43bfcba6","Type":"ContainerStarted","Data":"6855a0e286c59553f328390439c0f328eef3a1baa5c99e54013ca3a6d0864286"} Oct 07 13:43:41 crc kubenswrapper[4677]: I1007 13:43:41.994336 4677 scope.go:117] "RemoveContainer" containerID="9c395efdf36cc3142121da31bd43ed02b4d941c9bf27ad091d7892140334598b" Oct 07 13:44:10 crc kubenswrapper[4677]: I1007 13:44:10.205187 4677 generic.go:334] "Generic (PLEG): container finished" podID="9703992d-5370-4c26-9afa-0e22239ab5e3" containerID="fd80b5cbc6950c5f215066aee5aba45e090645834826a405747a70d19859890a" exitCode=0 Oct 07 13:44:10 crc kubenswrapper[4677]: I1007 13:44:10.205281 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wr287/must-gather-4j4vh" event={"ID":"9703992d-5370-4c26-9afa-0e22239ab5e3","Type":"ContainerDied","Data":"fd80b5cbc6950c5f215066aee5aba45e090645834826a405747a70d19859890a"} Oct 07 13:44:10 crc kubenswrapper[4677]: I1007 13:44:10.206181 4677 scope.go:117] "RemoveContainer" containerID="fd80b5cbc6950c5f215066aee5aba45e090645834826a405747a70d19859890a" Oct 07 13:44:11 crc kubenswrapper[4677]: I1007 13:44:11.062347 4677 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-wr287_must-gather-4j4vh_9703992d-5370-4c26-9afa-0e22239ab5e3/gather/0.log" Oct 07 13:44:19 crc kubenswrapper[4677]: I1007 13:44:19.814653 4677 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-wr287/must-gather-4j4vh"] Oct 07 13:44:19 crc kubenswrapper[4677]: I1007 13:44:19.815858 4677 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-wr287/must-gather-4j4vh" podUID="9703992d-5370-4c26-9afa-0e22239ab5e3" containerName="copy" containerID="cri-o://ea94d9bbc90ff9807410889556f24946bf8f7a130d9711ad16087d9c6ac74399" gracePeriod=2 Oct 07 13:44:19 crc kubenswrapper[4677]: I1007 13:44:19.821318 4677 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-wr287/must-gather-4j4vh"] Oct 07 13:44:20 crc kubenswrapper[4677]: I1007 13:44:20.146631 4677 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-wr287_must-gather-4j4vh_9703992d-5370-4c26-9afa-0e22239ab5e3/copy/0.log" Oct 07 13:44:20 crc kubenswrapper[4677]: I1007 13:44:20.147193 4677 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-wr287/must-gather-4j4vh" Oct 07 13:44:20 crc kubenswrapper[4677]: I1007 13:44:20.275045 4677 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-wr287_must-gather-4j4vh_9703992d-5370-4c26-9afa-0e22239ab5e3/copy/0.log" Oct 07 13:44:20 crc kubenswrapper[4677]: I1007 13:44:20.275547 4677 generic.go:334] "Generic (PLEG): container finished" podID="9703992d-5370-4c26-9afa-0e22239ab5e3" containerID="ea94d9bbc90ff9807410889556f24946bf8f7a130d9711ad16087d9c6ac74399" exitCode=143 Oct 07 13:44:20 crc kubenswrapper[4677]: I1007 13:44:20.275595 4677 scope.go:117] "RemoveContainer" containerID="ea94d9bbc90ff9807410889556f24946bf8f7a130d9711ad16087d9c6ac74399" Oct 07 13:44:20 crc kubenswrapper[4677]: I1007 13:44:20.275764 4677 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-wr287/must-gather-4j4vh" Oct 07 13:44:20 crc kubenswrapper[4677]: I1007 13:44:20.296695 4677 scope.go:117] "RemoveContainer" containerID="fd80b5cbc6950c5f215066aee5aba45e090645834826a405747a70d19859890a" Oct 07 13:44:20 crc kubenswrapper[4677]: I1007 13:44:20.328003 4677 scope.go:117] "RemoveContainer" containerID="ea94d9bbc90ff9807410889556f24946bf8f7a130d9711ad16087d9c6ac74399" Oct 07 13:44:20 crc kubenswrapper[4677]: E1007 13:44:20.329664 4677 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ea94d9bbc90ff9807410889556f24946bf8f7a130d9711ad16087d9c6ac74399\": container with ID starting with ea94d9bbc90ff9807410889556f24946bf8f7a130d9711ad16087d9c6ac74399 not found: ID does not exist" containerID="ea94d9bbc90ff9807410889556f24946bf8f7a130d9711ad16087d9c6ac74399" Oct 07 13:44:20 crc kubenswrapper[4677]: I1007 13:44:20.329718 4677 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea94d9bbc90ff9807410889556f24946bf8f7a130d9711ad16087d9c6ac74399"} err="failed to get container status \"ea94d9bbc90ff9807410889556f24946bf8f7a130d9711ad16087d9c6ac74399\": rpc error: code = NotFound desc = could not find container \"ea94d9bbc90ff9807410889556f24946bf8f7a130d9711ad16087d9c6ac74399\": container with ID starting with ea94d9bbc90ff9807410889556f24946bf8f7a130d9711ad16087d9c6ac74399 not found: ID does not exist" Oct 07 13:44:20 crc kubenswrapper[4677]: I1007 13:44:20.329751 4677 scope.go:117] "RemoveContainer" containerID="fd80b5cbc6950c5f215066aee5aba45e090645834826a405747a70d19859890a" Oct 07 13:44:20 crc kubenswrapper[4677]: E1007 13:44:20.330134 4677 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fd80b5cbc6950c5f215066aee5aba45e090645834826a405747a70d19859890a\": container with ID starting with fd80b5cbc6950c5f215066aee5aba45e090645834826a405747a70d19859890a not found: ID does not exist" containerID="fd80b5cbc6950c5f215066aee5aba45e090645834826a405747a70d19859890a" Oct 07 13:44:20 crc kubenswrapper[4677]: I1007 13:44:20.330157 4677 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd80b5cbc6950c5f215066aee5aba45e090645834826a405747a70d19859890a"} err="failed to get container status \"fd80b5cbc6950c5f215066aee5aba45e090645834826a405747a70d19859890a\": rpc error: code = NotFound desc = could not find container \"fd80b5cbc6950c5f215066aee5aba45e090645834826a405747a70d19859890a\": container with ID starting with 
fd80b5cbc6950c5f215066aee5aba45e090645834826a405747a70d19859890a not found: ID does not exist" Oct 07 13:44:20 crc kubenswrapper[4677]: I1007 13:44:20.337825 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/9703992d-5370-4c26-9afa-0e22239ab5e3-must-gather-output\") pod \"9703992d-5370-4c26-9afa-0e22239ab5e3\" (UID: \"9703992d-5370-4c26-9afa-0e22239ab5e3\") " Oct 07 13:44:20 crc kubenswrapper[4677]: I1007 13:44:20.338254 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tzmxv\" (UniqueName: \"kubernetes.io/projected/9703992d-5370-4c26-9afa-0e22239ab5e3-kube-api-access-tzmxv\") pod \"9703992d-5370-4c26-9afa-0e22239ab5e3\" (UID: \"9703992d-5370-4c26-9afa-0e22239ab5e3\") " Oct 07 13:44:20 crc kubenswrapper[4677]: I1007 13:44:20.346788 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9703992d-5370-4c26-9afa-0e22239ab5e3-kube-api-access-tzmxv" (OuterVolumeSpecName: "kube-api-access-tzmxv") pod "9703992d-5370-4c26-9afa-0e22239ab5e3" (UID: "9703992d-5370-4c26-9afa-0e22239ab5e3"). InnerVolumeSpecName "kube-api-access-tzmxv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:44:20 crc kubenswrapper[4677]: I1007 13:44:20.406167 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9703992d-5370-4c26-9afa-0e22239ab5e3-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "9703992d-5370-4c26-9afa-0e22239ab5e3" (UID: "9703992d-5370-4c26-9afa-0e22239ab5e3"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:44:20 crc kubenswrapper[4677]: I1007 13:44:20.439932 4677 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tzmxv\" (UniqueName: \"kubernetes.io/projected/9703992d-5370-4c26-9afa-0e22239ab5e3-kube-api-access-tzmxv\") on node \"crc\" DevicePath \"\"" Oct 07 13:44:20 crc kubenswrapper[4677]: I1007 13:44:20.439975 4677 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/9703992d-5370-4c26-9afa-0e22239ab5e3-must-gather-output\") on node \"crc\" DevicePath \"\"" Oct 07 13:44:21 crc kubenswrapper[4677]: I1007 13:44:21.311139 4677 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9703992d-5370-4c26-9afa-0e22239ab5e3" path="/var/lib/kubelet/pods/9703992d-5370-4c26-9afa-0e22239ab5e3/volumes" Oct 07 13:45:00 crc kubenswrapper[4677]: I1007 13:45:00.137424 4677 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330745-gwhtk"] Oct 07 13:45:00 crc kubenswrapper[4677]: E1007 13:45:00.140932 4677 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03f37fe1-d7e4-4f42-b103-aa783f240b65" containerName="extract-content" Oct 07 13:45:00 crc kubenswrapper[4677]: I1007 13:45:00.141054 4677 state_mem.go:107] "Deleted CPUSet assignment" podUID="03f37fe1-d7e4-4f42-b103-aa783f240b65" containerName="extract-content" Oct 07 13:45:00 crc kubenswrapper[4677]: E1007 13:45:00.141146 4677 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9703992d-5370-4c26-9afa-0e22239ab5e3" containerName="gather" Oct 07 13:45:00 crc kubenswrapper[4677]: I1007 13:45:00.141230 4677 state_mem.go:107] "Deleted CPUSet assignment" podUID="9703992d-5370-4c26-9afa-0e22239ab5e3" containerName="gather" Oct 07 13:45:00 crc 
kubenswrapper[4677]: E1007 13:45:00.141311 4677 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37e4e5a2-3646-4573-a3cf-0212f31dc9d4" containerName="extract-content" Oct 07 13:45:00 crc kubenswrapper[4677]: I1007 13:45:00.141389 4677 state_mem.go:107] "Deleted CPUSet assignment" podUID="37e4e5a2-3646-4573-a3cf-0212f31dc9d4" containerName="extract-content" Oct 07 13:45:00 crc kubenswrapper[4677]: E1007 13:45:00.141506 4677 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fad3b0e3-5c0f-43f8-98ec-cae1e66b1cea" containerName="registry-server" Oct 07 13:45:00 crc kubenswrapper[4677]: I1007 13:45:00.141899 4677 state_mem.go:107] "Deleted CPUSet assignment" podUID="fad3b0e3-5c0f-43f8-98ec-cae1e66b1cea" containerName="registry-server" Oct 07 13:45:00 crc kubenswrapper[4677]: E1007 13:45:00.142261 4677 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fad3b0e3-5c0f-43f8-98ec-cae1e66b1cea" containerName="extract-utilities" Oct 07 13:45:00 crc kubenswrapper[4677]: I1007 13:45:00.142371 4677 state_mem.go:107] "Deleted CPUSet assignment" podUID="fad3b0e3-5c0f-43f8-98ec-cae1e66b1cea" containerName="extract-utilities" Oct 07 13:45:00 crc kubenswrapper[4677]: E1007 13:45:00.143584 4677 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9703992d-5370-4c26-9afa-0e22239ab5e3" containerName="copy" Oct 07 13:45:00 crc kubenswrapper[4677]: I1007 13:45:00.143612 4677 state_mem.go:107] "Deleted CPUSet assignment" podUID="9703992d-5370-4c26-9afa-0e22239ab5e3" containerName="copy" Oct 07 13:45:00 crc kubenswrapper[4677]: E1007 13:45:00.143651 4677 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03f37fe1-d7e4-4f42-b103-aa783f240b65" containerName="extract-utilities" Oct 07 13:45:00 crc kubenswrapper[4677]: I1007 13:45:00.143665 4677 state_mem.go:107] "Deleted CPUSet assignment" podUID="03f37fe1-d7e4-4f42-b103-aa783f240b65" containerName="extract-utilities" Oct 07 13:45:00 crc kubenswrapper[4677]: E1007 13:45:00.143680 4677 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03f37fe1-d7e4-4f42-b103-aa783f240b65" containerName="registry-server" Oct 07 13:45:00 crc kubenswrapper[4677]: I1007 13:45:00.143692 4677 state_mem.go:107] "Deleted CPUSet assignment" podUID="03f37fe1-d7e4-4f42-b103-aa783f240b65" containerName="registry-server" Oct 07 13:45:00 crc kubenswrapper[4677]: E1007 13:45:00.143707 4677 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37e4e5a2-3646-4573-a3cf-0212f31dc9d4" containerName="registry-server" Oct 07 13:45:00 crc kubenswrapper[4677]: I1007 13:45:00.143721 4677 state_mem.go:107] "Deleted CPUSet assignment" podUID="37e4e5a2-3646-4573-a3cf-0212f31dc9d4" containerName="registry-server" Oct 07 13:45:00 crc kubenswrapper[4677]: E1007 13:45:00.143736 4677 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fad3b0e3-5c0f-43f8-98ec-cae1e66b1cea" containerName="extract-content" Oct 07 13:45:00 crc kubenswrapper[4677]: I1007 13:45:00.143747 4677 state_mem.go:107] "Deleted CPUSet assignment" podUID="fad3b0e3-5c0f-43f8-98ec-cae1e66b1cea" containerName="extract-content" Oct 07 13:45:00 crc kubenswrapper[4677]: E1007 13:45:00.143762 4677 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37e4e5a2-3646-4573-a3cf-0212f31dc9d4" containerName="extract-utilities" Oct 07 13:45:00 crc kubenswrapper[4677]: I1007 13:45:00.143773 4677 state_mem.go:107] "Deleted CPUSet assignment" podUID="37e4e5a2-3646-4573-a3cf-0212f31dc9d4" containerName="extract-utilities" Oct 07 
13:45:00 crc kubenswrapper[4677]: I1007 13:45:00.144048 4677 memory_manager.go:354] "RemoveStaleState removing state" podUID="fad3b0e3-5c0f-43f8-98ec-cae1e66b1cea" containerName="registry-server" Oct 07 13:45:00 crc kubenswrapper[4677]: I1007 13:45:00.144070 4677 memory_manager.go:354] "RemoveStaleState removing state" podUID="9703992d-5370-4c26-9afa-0e22239ab5e3" containerName="gather" Oct 07 13:45:00 crc kubenswrapper[4677]: I1007 13:45:00.144087 4677 memory_manager.go:354] "RemoveStaleState removing state" podUID="37e4e5a2-3646-4573-a3cf-0212f31dc9d4" containerName="registry-server" Oct 07 13:45:00 crc kubenswrapper[4677]: I1007 13:45:00.144099 4677 memory_manager.go:354] "RemoveStaleState removing state" podUID="03f37fe1-d7e4-4f42-b103-aa783f240b65" containerName="registry-server" Oct 07 13:45:00 crc kubenswrapper[4677]: I1007 13:45:00.144123 4677 memory_manager.go:354] "RemoveStaleState removing state" podUID="9703992d-5370-4c26-9afa-0e22239ab5e3" containerName="copy" Oct 07 13:45:00 crc kubenswrapper[4677]: I1007 13:45:00.144842 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330745-gwhtk" Oct 07 13:45:00 crc kubenswrapper[4677]: I1007 13:45:00.146821 4677 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 07 13:45:00 crc kubenswrapper[4677]: I1007 13:45:00.146952 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330745-gwhtk"] Oct 07 13:45:00 crc kubenswrapper[4677]: I1007 13:45:00.147128 4677 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 07 13:45:00 crc kubenswrapper[4677]: I1007 13:45:00.328397 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b4ntf\" (UniqueName: \"kubernetes.io/projected/b9ddce74-f462-4793-a549-2ad23e668482-kube-api-access-b4ntf\") pod \"collect-profiles-29330745-gwhtk\" (UID: \"b9ddce74-f462-4793-a549-2ad23e668482\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330745-gwhtk" Oct 07 13:45:00 crc kubenswrapper[4677]: I1007 13:45:00.328459 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b9ddce74-f462-4793-a549-2ad23e668482-config-volume\") pod \"collect-profiles-29330745-gwhtk\" (UID: \"b9ddce74-f462-4793-a549-2ad23e668482\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330745-gwhtk" Oct 07 13:45:00 crc kubenswrapper[4677]: I1007 13:45:00.328513 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b9ddce74-f462-4793-a549-2ad23e668482-secret-volume\") pod \"collect-profiles-29330745-gwhtk\" (UID: \"b9ddce74-f462-4793-a549-2ad23e668482\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330745-gwhtk" Oct 07 13:45:00 crc kubenswrapper[4677]: I1007 13:45:00.429845 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b4ntf\" (UniqueName: \"kubernetes.io/projected/b9ddce74-f462-4793-a549-2ad23e668482-kube-api-access-b4ntf\") pod \"collect-profiles-29330745-gwhtk\" (UID: \"b9ddce74-f462-4793-a549-2ad23e668482\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29330745-gwhtk" Oct 07 13:45:00 crc kubenswrapper[4677]: I1007 13:45:00.429897 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b9ddce74-f462-4793-a549-2ad23e668482-config-volume\") pod \"collect-profiles-29330745-gwhtk\" (UID: \"b9ddce74-f462-4793-a549-2ad23e668482\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330745-gwhtk" Oct 07 13:45:00 crc kubenswrapper[4677]: I1007 13:45:00.429982 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b9ddce74-f462-4793-a549-2ad23e668482-secret-volume\") pod \"collect-profiles-29330745-gwhtk\" (UID: \"b9ddce74-f462-4793-a549-2ad23e668482\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330745-gwhtk" Oct 07 13:45:00 crc kubenswrapper[4677]: I1007 13:45:00.431939 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b9ddce74-f462-4793-a549-2ad23e668482-config-volume\") pod \"collect-profiles-29330745-gwhtk\" (UID: \"b9ddce74-f462-4793-a549-2ad23e668482\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330745-gwhtk" Oct 07 13:45:00 crc kubenswrapper[4677]: I1007 13:45:00.441062 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b9ddce74-f462-4793-a549-2ad23e668482-secret-volume\") pod \"collect-profiles-29330745-gwhtk\" (UID: \"b9ddce74-f462-4793-a549-2ad23e668482\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330745-gwhtk" Oct 07 13:45:00 crc kubenswrapper[4677]: I1007 13:45:00.448590 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b4ntf\" (UniqueName: \"kubernetes.io/projected/b9ddce74-f462-4793-a549-2ad23e668482-kube-api-access-b4ntf\") pod \"collect-profiles-29330745-gwhtk\" (UID: \"b9ddce74-f462-4793-a549-2ad23e668482\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330745-gwhtk" Oct 07 13:45:00 crc kubenswrapper[4677]: I1007 13:45:00.463228 4677 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330745-gwhtk" Oct 07 13:45:00 crc kubenswrapper[4677]: I1007 13:45:00.694724 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330745-gwhtk"] Oct 07 13:45:01 crc kubenswrapper[4677]: I1007 13:45:01.544500 4677 generic.go:334] "Generic (PLEG): container finished" podID="b9ddce74-f462-4793-a549-2ad23e668482" containerID="3b78557283e2812b1b06151e9504b6ce003ac8b3babd2bfd4822c5ddf71ec7a5" exitCode=0 Oct 07 13:45:01 crc kubenswrapper[4677]: I1007 13:45:01.544760 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29330745-gwhtk" event={"ID":"b9ddce74-f462-4793-a549-2ad23e668482","Type":"ContainerDied","Data":"3b78557283e2812b1b06151e9504b6ce003ac8b3babd2bfd4822c5ddf71ec7a5"} Oct 07 13:45:01 crc kubenswrapper[4677]: I1007 13:45:01.545028 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29330745-gwhtk" event={"ID":"b9ddce74-f462-4793-a549-2ad23e668482","Type":"ContainerStarted","Data":"7d30796013aabf285d806dff5e9b3441b1c779676f04a2ad191da988f21e8580"} Oct 07 13:45:02 crc kubenswrapper[4677]: I1007 13:45:02.776090 4677 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330745-gwhtk" Oct 07 13:45:02 crc kubenswrapper[4677]: I1007 13:45:02.960464 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b9ddce74-f462-4793-a549-2ad23e668482-secret-volume\") pod \"b9ddce74-f462-4793-a549-2ad23e668482\" (UID: \"b9ddce74-f462-4793-a549-2ad23e668482\") " Oct 07 13:45:02 crc kubenswrapper[4677]: I1007 13:45:02.960558 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b9ddce74-f462-4793-a549-2ad23e668482-config-volume\") pod \"b9ddce74-f462-4793-a549-2ad23e668482\" (UID: \"b9ddce74-f462-4793-a549-2ad23e668482\") " Oct 07 13:45:02 crc kubenswrapper[4677]: I1007 13:45:02.960613 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b4ntf\" (UniqueName: \"kubernetes.io/projected/b9ddce74-f462-4793-a549-2ad23e668482-kube-api-access-b4ntf\") pod \"b9ddce74-f462-4793-a549-2ad23e668482\" (UID: \"b9ddce74-f462-4793-a549-2ad23e668482\") " Oct 07 13:45:02 crc kubenswrapper[4677]: I1007 13:45:02.961825 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b9ddce74-f462-4793-a549-2ad23e668482-config-volume" (OuterVolumeSpecName: "config-volume") pod "b9ddce74-f462-4793-a549-2ad23e668482" (UID: "b9ddce74-f462-4793-a549-2ad23e668482"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:45:02 crc kubenswrapper[4677]: I1007 13:45:02.967070 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b9ddce74-f462-4793-a549-2ad23e668482-kube-api-access-b4ntf" (OuterVolumeSpecName: "kube-api-access-b4ntf") pod "b9ddce74-f462-4793-a549-2ad23e668482" (UID: "b9ddce74-f462-4793-a549-2ad23e668482"). InnerVolumeSpecName "kube-api-access-b4ntf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:45:02 crc kubenswrapper[4677]: I1007 13:45:02.968579 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9ddce74-f462-4793-a549-2ad23e668482-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "b9ddce74-f462-4793-a549-2ad23e668482" (UID: "b9ddce74-f462-4793-a549-2ad23e668482"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:45:03 crc kubenswrapper[4677]: I1007 13:45:03.062163 4677 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b9ddce74-f462-4793-a549-2ad23e668482-config-volume\") on node \"crc\" DevicePath \"\"" Oct 07 13:45:03 crc kubenswrapper[4677]: I1007 13:45:03.062204 4677 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b4ntf\" (UniqueName: \"kubernetes.io/projected/b9ddce74-f462-4793-a549-2ad23e668482-kube-api-access-b4ntf\") on node \"crc\" DevicePath \"\"" Oct 07 13:45:03 crc kubenswrapper[4677]: I1007 13:45:03.062219 4677 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b9ddce74-f462-4793-a549-2ad23e668482-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 07 13:45:03 crc kubenswrapper[4677]: I1007 13:45:03.564363 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29330745-gwhtk" event={"ID":"b9ddce74-f462-4793-a549-2ad23e668482","Type":"ContainerDied","Data":"7d30796013aabf285d806dff5e9b3441b1c779676f04a2ad191da988f21e8580"} Oct 07 13:45:03 crc kubenswrapper[4677]: I1007 13:45:03.564415 4677 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7d30796013aabf285d806dff5e9b3441b1c779676f04a2ad191da988f21e8580" Oct 07 13:45:03 crc kubenswrapper[4677]: I1007 13:45:03.564468 4677 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330745-gwhtk" Oct 07 13:45:03 crc kubenswrapper[4677]: I1007 13:45:03.839199 4677 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330700-zvg2m"] Oct 07 13:45:03 crc kubenswrapper[4677]: I1007 13:45:03.842055 4677 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330700-zvg2m"] Oct 07 13:45:05 crc kubenswrapper[4677]: I1007 13:45:05.238542 4677 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-cng9f"] Oct 07 13:45:05 crc kubenswrapper[4677]: E1007 13:45:05.239321 4677 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9ddce74-f462-4793-a549-2ad23e668482" containerName="collect-profiles" Oct 07 13:45:05 crc kubenswrapper[4677]: I1007 13:45:05.239351 4677 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9ddce74-f462-4793-a549-2ad23e668482" containerName="collect-profiles" Oct 07 13:45:05 crc kubenswrapper[4677]: I1007 13:45:05.239649 4677 memory_manager.go:354] "RemoveStaleState removing state" podUID="b9ddce74-f462-4793-a549-2ad23e668482" containerName="collect-profiles" Oct 07 13:45:05 crc kubenswrapper[4677]: I1007 13:45:05.241305 4677 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cng9f" Oct 07 13:45:05 crc kubenswrapper[4677]: I1007 13:45:05.252635 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-cng9f"] Oct 07 13:45:05 crc kubenswrapper[4677]: I1007 13:45:05.291117 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d1e54424-18f3-4674-8554-db47e45f4bc2-utilities\") pod \"redhat-marketplace-cng9f\" (UID: \"d1e54424-18f3-4674-8554-db47e45f4bc2\") " pod="openshift-marketplace/redhat-marketplace-cng9f" Oct 07 13:45:05 crc kubenswrapper[4677]: I1007 13:45:05.291337 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z7dqx\" (UniqueName: \"kubernetes.io/projected/d1e54424-18f3-4674-8554-db47e45f4bc2-kube-api-access-z7dqx\") pod \"redhat-marketplace-cng9f\" (UID: \"d1e54424-18f3-4674-8554-db47e45f4bc2\") " pod="openshift-marketplace/redhat-marketplace-cng9f" Oct 07 13:45:05 crc kubenswrapper[4677]: I1007 13:45:05.291443 4677 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d1e54424-18f3-4674-8554-db47e45f4bc2-catalog-content\") pod \"redhat-marketplace-cng9f\" (UID: \"d1e54424-18f3-4674-8554-db47e45f4bc2\") " pod="openshift-marketplace/redhat-marketplace-cng9f" Oct 07 13:45:05 crc kubenswrapper[4677]: I1007 13:45:05.314424 4677 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b84820c2-fd0b-4e52-801c-a70286d639de" path="/var/lib/kubelet/pods/b84820c2-fd0b-4e52-801c-a70286d639de/volumes" Oct 07 13:45:05 crc kubenswrapper[4677]: I1007 13:45:05.392141 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d1e54424-18f3-4674-8554-db47e45f4bc2-utilities\") pod \"redhat-marketplace-cng9f\" (UID: \"d1e54424-18f3-4674-8554-db47e45f4bc2\") " pod="openshift-marketplace/redhat-marketplace-cng9f" Oct 07 13:45:05 crc kubenswrapper[4677]: I1007 13:45:05.392219 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z7dqx\" (UniqueName: \"kubernetes.io/projected/d1e54424-18f3-4674-8554-db47e45f4bc2-kube-api-access-z7dqx\") pod \"redhat-marketplace-cng9f\" (UID: \"d1e54424-18f3-4674-8554-db47e45f4bc2\") " pod="openshift-marketplace/redhat-marketplace-cng9f" Oct 07 13:45:05 crc kubenswrapper[4677]: I1007 13:45:05.392273 4677 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d1e54424-18f3-4674-8554-db47e45f4bc2-catalog-content\") pod \"redhat-marketplace-cng9f\" (UID: \"d1e54424-18f3-4674-8554-db47e45f4bc2\") " pod="openshift-marketplace/redhat-marketplace-cng9f" Oct 07 13:45:05 crc kubenswrapper[4677]: I1007 13:45:05.392807 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d1e54424-18f3-4674-8554-db47e45f4bc2-utilities\") pod \"redhat-marketplace-cng9f\" (UID: \"d1e54424-18f3-4674-8554-db47e45f4bc2\") " pod="openshift-marketplace/redhat-marketplace-cng9f" Oct 07 13:45:05 crc kubenswrapper[4677]: I1007 13:45:05.392889 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d1e54424-18f3-4674-8554-db47e45f4bc2-catalog-content\") 
pod \"redhat-marketplace-cng9f\" (UID: \"d1e54424-18f3-4674-8554-db47e45f4bc2\") " pod="openshift-marketplace/redhat-marketplace-cng9f" Oct 07 13:45:05 crc kubenswrapper[4677]: I1007 13:45:05.415658 4677 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z7dqx\" (UniqueName: \"kubernetes.io/projected/d1e54424-18f3-4674-8554-db47e45f4bc2-kube-api-access-z7dqx\") pod \"redhat-marketplace-cng9f\" (UID: \"d1e54424-18f3-4674-8554-db47e45f4bc2\") " pod="openshift-marketplace/redhat-marketplace-cng9f" Oct 07 13:45:05 crc kubenswrapper[4677]: I1007 13:45:05.580821 4677 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cng9f" Oct 07 13:45:05 crc kubenswrapper[4677]: I1007 13:45:05.795883 4677 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-cng9f"] Oct 07 13:45:06 crc kubenswrapper[4677]: I1007 13:45:06.580140 4677 generic.go:334] "Generic (PLEG): container finished" podID="d1e54424-18f3-4674-8554-db47e45f4bc2" containerID="6bc1eafb5bce9f7f58ad52393d3cb67a3e41c95afa563f33d40383b66fa256ed" exitCode=0 Oct 07 13:45:06 crc kubenswrapper[4677]: I1007 13:45:06.580189 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cng9f" event={"ID":"d1e54424-18f3-4674-8554-db47e45f4bc2","Type":"ContainerDied","Data":"6bc1eafb5bce9f7f58ad52393d3cb67a3e41c95afa563f33d40383b66fa256ed"} Oct 07 13:45:06 crc kubenswrapper[4677]: I1007 13:45:06.580216 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cng9f" event={"ID":"d1e54424-18f3-4674-8554-db47e45f4bc2","Type":"ContainerStarted","Data":"69171b6df1847b1363ca00e7f9a5163bc125f53018da4beaf4a2b3ea1954abfa"} Oct 07 13:45:09 crc kubenswrapper[4677]: I1007 13:45:09.601019 4677 generic.go:334] "Generic (PLEG): container finished" podID="d1e54424-18f3-4674-8554-db47e45f4bc2" containerID="4c71ef028e35a905188989c3abbd336baf9a8e6a4d0b5b9fd73de38d12baf84d" exitCode=0 Oct 07 13:45:09 crc kubenswrapper[4677]: I1007 13:45:09.601124 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cng9f" event={"ID":"d1e54424-18f3-4674-8554-db47e45f4bc2","Type":"ContainerDied","Data":"4c71ef028e35a905188989c3abbd336baf9a8e6a4d0b5b9fd73de38d12baf84d"} Oct 07 13:45:11 crc kubenswrapper[4677]: I1007 13:45:11.617717 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cng9f" event={"ID":"d1e54424-18f3-4674-8554-db47e45f4bc2","Type":"ContainerStarted","Data":"2404fe2781091b28ac8e3502f86e329d2be580e6e8f6a8c38bd2716c38252a11"} Oct 07 13:45:11 crc kubenswrapper[4677]: I1007 13:45:11.641390 4677 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-cng9f" podStartSLOduration=2.557804865 podStartE2EDuration="6.641358291s" podCreationTimestamp="2025-10-07 13:45:05 +0000 UTC" firstStartedPulling="2025-10-07 13:45:06.582311394 +0000 UTC m=+2278.068020509" lastFinishedPulling="2025-10-07 13:45:10.66586481 +0000 UTC m=+2282.151573935" observedRunningTime="2025-10-07 13:45:11.631897219 +0000 UTC m=+2283.117606374" watchObservedRunningTime="2025-10-07 13:45:11.641358291 +0000 UTC m=+2283.127067456" Oct 07 13:45:11 crc kubenswrapper[4677]: I1007 13:45:11.887181 4677 scope.go:117] "RemoveContainer" containerID="f538f6f357103da92c37795caa9efcf1541ba6dad60f92ce602a4c73ae66687d" Oct 07 13:45:15 crc kubenswrapper[4677]: 
I1007 13:45:15.581200 4677 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-cng9f" Oct 07 13:45:15 crc kubenswrapper[4677]: I1007 13:45:15.581742 4677 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-cng9f" Oct 07 13:45:15 crc kubenswrapper[4677]: I1007 13:45:15.620540 4677 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-cng9f" Oct 07 13:45:15 crc kubenswrapper[4677]: I1007 13:45:15.676228 4677 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-cng9f" Oct 07 13:45:15 crc kubenswrapper[4677]: I1007 13:45:15.846829 4677 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-cng9f"] Oct 07 13:45:17 crc kubenswrapper[4677]: I1007 13:45:17.654646 4677 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-cng9f" podUID="d1e54424-18f3-4674-8554-db47e45f4bc2" containerName="registry-server" containerID="cri-o://2404fe2781091b28ac8e3502f86e329d2be580e6e8f6a8c38bd2716c38252a11" gracePeriod=2 Oct 07 13:45:18 crc kubenswrapper[4677]: I1007 13:45:18.102153 4677 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cng9f" Oct 07 13:45:18 crc kubenswrapper[4677]: I1007 13:45:18.183544 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z7dqx\" (UniqueName: \"kubernetes.io/projected/d1e54424-18f3-4674-8554-db47e45f4bc2-kube-api-access-z7dqx\") pod \"d1e54424-18f3-4674-8554-db47e45f4bc2\" (UID: \"d1e54424-18f3-4674-8554-db47e45f4bc2\") " Oct 07 13:45:18 crc kubenswrapper[4677]: I1007 13:45:18.183636 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d1e54424-18f3-4674-8554-db47e45f4bc2-catalog-content\") pod \"d1e54424-18f3-4674-8554-db47e45f4bc2\" (UID: \"d1e54424-18f3-4674-8554-db47e45f4bc2\") " Oct 07 13:45:18 crc kubenswrapper[4677]: I1007 13:45:18.183752 4677 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d1e54424-18f3-4674-8554-db47e45f4bc2-utilities\") pod \"d1e54424-18f3-4674-8554-db47e45f4bc2\" (UID: \"d1e54424-18f3-4674-8554-db47e45f4bc2\") " Oct 07 13:45:18 crc kubenswrapper[4677]: I1007 13:45:18.184726 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d1e54424-18f3-4674-8554-db47e45f4bc2-utilities" (OuterVolumeSpecName: "utilities") pod "d1e54424-18f3-4674-8554-db47e45f4bc2" (UID: "d1e54424-18f3-4674-8554-db47e45f4bc2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:45:18 crc kubenswrapper[4677]: I1007 13:45:18.191225 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d1e54424-18f3-4674-8554-db47e45f4bc2-kube-api-access-z7dqx" (OuterVolumeSpecName: "kube-api-access-z7dqx") pod "d1e54424-18f3-4674-8554-db47e45f4bc2" (UID: "d1e54424-18f3-4674-8554-db47e45f4bc2"). InnerVolumeSpecName "kube-api-access-z7dqx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:45:18 crc kubenswrapper[4677]: I1007 13:45:18.198466 4677 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d1e54424-18f3-4674-8554-db47e45f4bc2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d1e54424-18f3-4674-8554-db47e45f4bc2" (UID: "d1e54424-18f3-4674-8554-db47e45f4bc2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:45:18 crc kubenswrapper[4677]: I1007 13:45:18.285425 4677 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z7dqx\" (UniqueName: \"kubernetes.io/projected/d1e54424-18f3-4674-8554-db47e45f4bc2-kube-api-access-z7dqx\") on node \"crc\" DevicePath \"\"" Oct 07 13:45:18 crc kubenswrapper[4677]: I1007 13:45:18.285542 4677 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d1e54424-18f3-4674-8554-db47e45f4bc2-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 13:45:18 crc kubenswrapper[4677]: I1007 13:45:18.285566 4677 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d1e54424-18f3-4674-8554-db47e45f4bc2-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 13:45:18 crc kubenswrapper[4677]: I1007 13:45:18.666566 4677 generic.go:334] "Generic (PLEG): container finished" podID="d1e54424-18f3-4674-8554-db47e45f4bc2" containerID="2404fe2781091b28ac8e3502f86e329d2be580e6e8f6a8c38bd2716c38252a11" exitCode=0 Oct 07 13:45:18 crc kubenswrapper[4677]: I1007 13:45:18.666638 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cng9f" event={"ID":"d1e54424-18f3-4674-8554-db47e45f4bc2","Type":"ContainerDied","Data":"2404fe2781091b28ac8e3502f86e329d2be580e6e8f6a8c38bd2716c38252a11"} Oct 07 13:45:18 crc kubenswrapper[4677]: I1007 13:45:18.666716 4677 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cng9f" Oct 07 13:45:18 crc kubenswrapper[4677]: I1007 13:45:18.667003 4677 scope.go:117] "RemoveContainer" containerID="2404fe2781091b28ac8e3502f86e329d2be580e6e8f6a8c38bd2716c38252a11" Oct 07 13:45:18 crc kubenswrapper[4677]: I1007 13:45:18.666981 4677 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cng9f" event={"ID":"d1e54424-18f3-4674-8554-db47e45f4bc2","Type":"ContainerDied","Data":"69171b6df1847b1363ca00e7f9a5163bc125f53018da4beaf4a2b3ea1954abfa"} Oct 07 13:45:18 crc kubenswrapper[4677]: I1007 13:45:18.700584 4677 scope.go:117] "RemoveContainer" containerID="4c71ef028e35a905188989c3abbd336baf9a8e6a4d0b5b9fd73de38d12baf84d" Oct 07 13:45:18 crc kubenswrapper[4677]: I1007 13:45:18.721142 4677 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-cng9f"] Oct 07 13:45:18 crc kubenswrapper[4677]: I1007 13:45:18.724112 4677 scope.go:117] "RemoveContainer" containerID="6bc1eafb5bce9f7f58ad52393d3cb67a3e41c95afa563f33d40383b66fa256ed" Oct 07 13:45:18 crc kubenswrapper[4677]: I1007 13:45:18.724219 4677 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-cng9f"] Oct 07 13:45:18 crc kubenswrapper[4677]: I1007 13:45:18.743107 4677 scope.go:117] "RemoveContainer" containerID="2404fe2781091b28ac8e3502f86e329d2be580e6e8f6a8c38bd2716c38252a11" Oct 07 13:45:18 crc kubenswrapper[4677]: E1007 13:45:18.743511 4677 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2404fe2781091b28ac8e3502f86e329d2be580e6e8f6a8c38bd2716c38252a11\": container with ID starting with 2404fe2781091b28ac8e3502f86e329d2be580e6e8f6a8c38bd2716c38252a11 not found: ID does not exist" containerID="2404fe2781091b28ac8e3502f86e329d2be580e6e8f6a8c38bd2716c38252a11" Oct 07 13:45:18 crc kubenswrapper[4677]: I1007 13:45:18.743551 4677 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2404fe2781091b28ac8e3502f86e329d2be580e6e8f6a8c38bd2716c38252a11"} err="failed to get container status \"2404fe2781091b28ac8e3502f86e329d2be580e6e8f6a8c38bd2716c38252a11\": rpc error: code = NotFound desc = could not find container \"2404fe2781091b28ac8e3502f86e329d2be580e6e8f6a8c38bd2716c38252a11\": container with ID starting with 2404fe2781091b28ac8e3502f86e329d2be580e6e8f6a8c38bd2716c38252a11 not found: ID does not exist" Oct 07 13:45:18 crc kubenswrapper[4677]: I1007 13:45:18.743576 4677 scope.go:117] "RemoveContainer" containerID="4c71ef028e35a905188989c3abbd336baf9a8e6a4d0b5b9fd73de38d12baf84d" Oct 07 13:45:18 crc kubenswrapper[4677]: E1007 13:45:18.743795 4677 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4c71ef028e35a905188989c3abbd336baf9a8e6a4d0b5b9fd73de38d12baf84d\": container with ID starting with 4c71ef028e35a905188989c3abbd336baf9a8e6a4d0b5b9fd73de38d12baf84d not found: ID does not exist" containerID="4c71ef028e35a905188989c3abbd336baf9a8e6a4d0b5b9fd73de38d12baf84d" Oct 07 13:45:18 crc kubenswrapper[4677]: I1007 13:45:18.743823 4677 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c71ef028e35a905188989c3abbd336baf9a8e6a4d0b5b9fd73de38d12baf84d"} err="failed to get container status \"4c71ef028e35a905188989c3abbd336baf9a8e6a4d0b5b9fd73de38d12baf84d\": rpc error: code = NotFound desc = could not find 
container \"4c71ef028e35a905188989c3abbd336baf9a8e6a4d0b5b9fd73de38d12baf84d\": container with ID starting with 4c71ef028e35a905188989c3abbd336baf9a8e6a4d0b5b9fd73de38d12baf84d not found: ID does not exist" Oct 07 13:45:18 crc kubenswrapper[4677]: I1007 13:45:18.743839 4677 scope.go:117] "RemoveContainer" containerID="6bc1eafb5bce9f7f58ad52393d3cb67a3e41c95afa563f33d40383b66fa256ed" Oct 07 13:45:18 crc kubenswrapper[4677]: E1007 13:45:18.744018 4677 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6bc1eafb5bce9f7f58ad52393d3cb67a3e41c95afa563f33d40383b66fa256ed\": container with ID starting with 6bc1eafb5bce9f7f58ad52393d3cb67a3e41c95afa563f33d40383b66fa256ed not found: ID does not exist" containerID="6bc1eafb5bce9f7f58ad52393d3cb67a3e41c95afa563f33d40383b66fa256ed" Oct 07 13:45:18 crc kubenswrapper[4677]: I1007 13:45:18.744042 4677 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6bc1eafb5bce9f7f58ad52393d3cb67a3e41c95afa563f33d40383b66fa256ed"} err="failed to get container status \"6bc1eafb5bce9f7f58ad52393d3cb67a3e41c95afa563f33d40383b66fa256ed\": rpc error: code = NotFound desc = could not find container \"6bc1eafb5bce9f7f58ad52393d3cb67a3e41c95afa563f33d40383b66fa256ed\": container with ID starting with 6bc1eafb5bce9f7f58ad52393d3cb67a3e41c95afa563f33d40383b66fa256ed not found: ID does not exist" Oct 07 13:45:19 crc kubenswrapper[4677]: I1007 13:45:19.315716 4677 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d1e54424-18f3-4674-8554-db47e45f4bc2" path="/var/lib/kubelet/pods/d1e54424-18f3-4674-8554-db47e45f4bc2/volumes"